ECSS-Q-ST-80C
6 March 2009
Space product assurance
Software product assurance
ECSS Secretariat
ESA-ESTEC
Requirements & Standards Division
Noordwijk, The Netherlands
Foreword
This Standard is one of the series of ECSS Standards intended to be applied together for the management, engineering and product assurance in space projects and applications. ECSS is a cooperative effort of the European Space Agency, national space agencies and European industry associations for the purpose of developing and maintaining common standards. Requirements in this Standard are defined in terms of what shall be accomplished, rather than in terms of how to organize and perform the necessary work. This allows existing organizational structures and methods to be applied where they are effective, and for the structures and methods to evolve as necessary without rewriting the standards.
This Standard has been prepared by the ECSS-Q-ST-80C Working Group, reviewed by the ECSS Executive Secretariat and approved by the ECSS Technical Authority.
Disclaimer
ECSS does not provide any warranty whatsoever, whether expressed, implied, or statutory, including, but not limited to, any warranty of merchantability or fitness for a particular purpose or any warranty that the contents of the item are error-free. In no respect shall ECSS incur any liability for any damages, including, but not limited to, direct, indirect, special, or consequential damages arising out of, resulting from, or in any way connected to the use of this Standard, whether or not based upon warranty, business agreement, tort, or otherwise; whether or not injury was sustained by persons or property or otherwise; and whether or not loss was sustained from, or arose out of, the results of, the item, or any services that may be provided by ECSS.
Published by: ESA Requirements and Standards Division
ESTEC, P.O. Box 299,
2200 AG Noordwijk
The Netherlands
Copyright: 2009 © by the European Space Agency for the members of ECSS
Change log
ECSS-Q-80A (19 April 1996): First issue

ECSS-Q-80B (10 October 2003): Second issue

ECSS-Q-ST-80C (6 March 2009): Third issue

Main changes with respect to previous version are:
• definition of software criticality categories and tailoring of the Standard based on those;
• improvement of requirements on reuse of software, software safety and dependability and software process assessment and improvement;
• streamlining of requirements to make the Standard more suitable for direct use in business agreements.
Table of contents
Change log.................................................................................................................3
1 Scope.......................................................................................................................8
2 Normative references.............................................................................................9
3 Terms, definitions and abbreviated terms..........................................................10
3.1 Terms from other standards ......................................................................10
3.2 Terms specific to the present standard ....................................................................10
3.3 Abbreviated terms ....................................................................................................16
4 Space system software product assurance principles.....................................18
4.1 Introduction...............................................................................................................18
4.2 Organization of this Standard...................................................................................19
4.3 Tailoring of this Standard .........................................................................................21
5 Software product assurance programme implementation ...............................22
5.1 Organization and responsibility ................................................................................22
5.1.1 Organization ...............................................................................................22
5.1.2 Responsibility and authority........................................................................22
5.1.3 Resources ..................................................................................................23
5.1.4 Software product assurance manager/engineer.........................................23
5.1.5 Training.......................................................................................................23
5.2 Software product assurance programme management ...........................................24
5.2.1 Software product assurance planning and control .....................................24
5.2.2 Software product assurance reporting........................................................25
5.2.3 Audits..........................................................................................................26
5.2.4 Alerts ..........................................................................................................26
5.2.5 Software problems......................................................................................26
5.2.6 Nonconformances ......................................................................................27
5.2.7 Quality requirements and quality models ...................................................27
5.3 Risk management and critical item control...............................................................28
5.3.1 Risk management.......................................................................................28
5.3.2 Critical item control.....................................................................................28
5.4 Supplier selection and control ..................................................................................28
5.4.1 Supplier selection .......................................................................................28
5.4.2 Supplier requirements ................................................................................29
5.4.3 Supplier monitoring.....................................................................................29
5.4.4 Criticality classification................................................................................30
5.5 Procurement.............................................................................................................30
5.5.1 Procurement documents ............................................................................30
5.5.2 Review of procured software component list..............................................30
5.5.3 Procurement details....................................................................................30
5.5.4 Identification ...............................................................................................30
5.5.5 Inspection ...................................................................................................30
5.5.6 Exportability................................................................................................31
5.6 Tools and supporting environment ...........................................................................31
5.6.1 Methods and tools ......................................................................................31
5.6.2 Development environment selection ..........................................................31
5.7 Assessment and improvement process....................................................................32
5.7.1 Process assessment ..................................................................................32
5.7.2 Assessment process ..................................................................................33
5.7.3 Process improvement.................................................................................34
6 Software process assurance...............................................................................35
6.1 Software development life cycle...............................................................................35
6.1.1 Life cycle definition .....................................................................................35
6.1.2 Process quality objectives ..........................................................................35
6.1.3 Life cycle definition review..........................................................................35
6.1.4 Life cycle resources....................................................................................35
6.1.5 Software validation process schedule ........................................................36
6.2 Requirements applicable to all software engineering processes..............................36
6.2.1 Documentation of processes......................................................................36
6.2.2 Software dependability and safety..............................................................37
6.2.3 Handling of critical software........................................................................39
6.2.4 Software configuration management..........................................................41
6.2.5 Process metrics..........................................................................................43
6.2.6 Verification..................................................................................................44
6.2.7 Reuse of existing software .........................................................................47
6.2.8 Automatic code generation.........................................................................50
6.3 Requirements applicable to individual software engineering processes or
activities....................................................................................................................51
6.3.1 Software related system requirements process .........................................51
6.3.2 Software requirements analysis .................................................................51
6.3.3 Software architectural design and design of software items ......................53
6.3.4 Coding ........................................................................................................54
6.3.5 Testing and validation.................................................................................55
6.3.6 Software delivery and acceptance..............................................................60
6.3.7 Operations..................................................................................................61
6.3.8 Maintenance...............................................................................................62
7 Software product quality assurance...................................................................64
7.1 Product quality objectives and metrication ...............................................................64
7.1.1 Deriving of requirements ............................................................................64
7.1.2 Quantitative definition of quality requirements............................................64
7.1.3 Assurance activities for product quality requirements ................................64
7.1.4 Product metrics...........................................................................................64
7.1.5 Basic metrics ..............................................................................................65
7.1.6 Reporting of metrics ...................................................................................65
7.1.7 Numerical accuracy....................................................................................65
7.1.8 Analysis of software maturity......................................................................66
7.2 Product quality requirements....................................................................................66
7.2.1 Requirements baseline and technical specification....................................66
7.2.2 Design and related documentation.............................................................67
7.2.3 Test and validation documentation.............................................................67
7.3 Software intended for reuse .....................................................................................68
7.3.1 Customer requirements..............................................................................68
7.3.2 Separate documentation ............................................................................68
7.3.3 Self-contained information..........................................................................68
7.3.4 Requirements for intended reuse ...............................................................68
7.3.5 Configuration management for intended reuse ..........................................68
7.3.6 Testing on different platforms.....................................................................69
7.3.7 Certificate of conformance..........................................................................69
7.4 Standard ground hardware and services for operational system .............................69
7.4.1 Hardware procurement...............................................................................69
7.4.2 Service procurement ..................................................................................69
7.4.3 Constraints .................................................................................................70
7.4.4 Selection.....................................................................................................70
7.4.5 Maintenance...............................................................................................70
7.5 Firmware ..................................................................................................................70
7.5.1 Device programming ..................................................................................70
7.5.2 Marking.......................................................................................................71
7.5.3 Calibration ..................................................................................................71
Annex A (informative) Software documentation..................................................72
Annex B (normative) Software product assurance plan (SPAP) - DRD .............78
Annex C (normative) Software product assurance milestone report
(SPAMR) - DRD....................................................................................................85
Annex D (normative) Tailoring of this Standard based on software
criticality ..............................................................................................................88
D.1 Software criticality categories...................................................................................88
D.2 Applicability matrix....................................................................................................88
Annex E (informative) List of requirements with built-in tailoring
capability..............................................................................................................99
Annex F (informative) Document organization and content at each
milestone ...........................................................................................................100
F.1 Introduction.............................................................................................................100
F.2 ECSS-Q-ST-80 Expected Output at SRR ..............................................................100
F.3 ECSS-Q-ST-80 Expected Output at PDR ..............................................................102
F.4 ECSS-Q-ST-80 Expected Output at CDR ..............................................................106
F.5 ECSS-Q-ST-80 Expected Output at QR.................................................................108
F.6 ECSS-Q-ST-80 Expected Output at AR.................................................................109
F.7 ECSS-Q-ST-80 Expected Output not associated with any specific milestone
review.....................................................................................................................111
Bibliography...........................................................................................................113
Figures
Figure 4-1: Software related processes in ECSS Standards .................................................19
Figure 4-2: Structure of this Standard ....................................................................................20
Figure A-1 : Overview of software documents .......................................................................72
Tables
Table A-1 : ECSS-E-ST-40 and ECSS-Q-ST-80 Document requirements list (DRL) ............73
Table B-1 : SPAP traceability to ECSS-E-ST-40 and ECSS-Q-ST-80 clauses......................78
Table C-1 : SPAMR traceability to ECSS-E-ST-40 and ECSS-Q-ST-80 clauses .....................85
Table D-1 : Software criticality categories..............................................................................88
Table D-2 : Applicability matrix based on software criticality..................................................89
1 Scope
This Standard defines a set of software product assurance requirements to be used for the development and maintenance of software for space systems. Space systems include manned and unmanned spacecraft, launchers, payloads, experiments and their associated ground equipment and facilities. Software includes the software component of firmware.

This Standard also applies to the development or reuse of non-deliverable software which affects the quality of the deliverable product or service provided by a space system, if the service is implemented by software.
ECSS-Q-ST-80 interfaces with space engineering and management, which are addressed in the Engineering (-E) and Management (-M) branches of the ECSS System, and explains how they relate to the software product assurance processes.
This Standard may be tailored for the specific characteristics and constraints of a space project in conformance with ECSS-S-ST-00.

Tailoring of this Standard to a specific business agreement or project, when software product assurance requirements are prepared, is also addressed in clause 4.3.
2 Normative references
The following normative documents contain provisions which, through reference in this text, constitute provisions of this ECSS Standard. For dated references, subsequent amendments to, or revisions of any of these publications do not apply. However, parties to agreements based on this ECSS Standard are encouraged to investigate the possibility of applying the more recent editions of the normative documents indicated below. For undated references, the latest edition of the publication referred to applies.
ECSS-S-ST-00-01          ECSS system – Glossary of terms
ECSS-E-ST-40             Space engineering – Software general requirements
ECSS-Q-ST-10             Space product assurance – Product assurance management
ECSS-Q-ST-10-04          Space product assurance – Critical-item control
ECSS-Q-ST-10-09          Space product assurance – Nonconformance control system
ECSS-Q-ST-20             Space product assurance – Quality assurance
ECSS-Q-ST-30             Space product assurance – Dependability
ECSS-Q-ST-40             Space product assurance – Safety
ECSS-M-ST-10             Space project management – Project planning and implementation
ECSS-M-ST-10-01          Space project management – Organization and conduct of reviews
ECSS-M-ST-40             Space project management – Configuration and information management
ECSS-M-ST-80             Space project management – Risk management
ISO/IEC 15504-2:2003     Software engineering – Process assessment – Part 2: Performing an assessment – First Edition
3 Terms, definitions and abbreviated terms
3.1 Terms from other standards
For the purpose of this Standard, the terms and definitions from ECSS-S-ST-00-01 apply, in particular for the terms:
• acceptance test
• software product
NOTE The terms and definitions are common for the ECSS-E-ST-40 and ECSS-Q-ST-80 Standards.
3.2 Terms specific to the present standard
3.2.1 automatic code generation
generation of source code with a tool from a model
3.2.2 code coverage
percentage of the software that has been executed (covered) by the test suite
3.2.3 competent assessor
person who has demonstrated the necessary skills, competencies and experience to lead a process assessment in conformance with ISO/IEC 15504
NOTE Adapted from ISO/IEC 15504:1998, Part 9.
3.2.4 condition
boolean expression not containing boolean operators
3.2.5 configurable code
code (source code or executable code) that can be tailored by setting values of parameters
NOTE This definition covers in particular classes of configurable code obtained by the following configuration means:
• configuration based on the use of a compilation directive;
• configuration based on the use of a link directive;
• configuration performed through a parameter defined in a configuration file;
• configuration performed through data defined in a database with impact on the actually executable parts of the software (e.g. parameters defining branch structures that result in the non-execution of existing parts of the code).
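As an informative illustration (not part of the normative text), one of the configuration means listed above, a parameter defined in a configuration file, can be sketched as follows. The file content, section name and parameter names are hypothetical:

```python
# Illustrative sketch (hypothetical parameter names): configurable code in the
# sense of 3.2.5, tailored through a parameter defined in a configuration file.
import configparser

CONFIG_TEXT = """
[telemetry]
# Parameters set at configuration time; downlink_enabled selects which branch
# of the delivered code actually executes (cf. the note on branch structures).
downlink_enabled = true
packet_size = 128
"""

def load_config(text: str) -> dict:
    """Parse the configuration file content into plain values."""
    parser = configparser.ConfigParser()
    parser.read_string(text)
    section = parser["telemetry"]
    return {
        "downlink_enabled": section.getboolean("downlink_enabled"),
        "packet_size": section.getint("packet_size"),
    }

def process(sample: bytes, cfg: dict):
    """A configuration parameter decides whether this branch executes at all."""
    if not cfg["downlink_enabled"]:
        return None  # this part of the code is effectively deactivated
    return sample[: cfg["packet_size"]]
```

The same delivered code thus yields different executable behaviour purely by editing the configuration file, without recompilation.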
3.2.6 COTS, OTS, MOTS software
for the purpose of this Standard, commercial off-the-shelf, off-the-shelf and modified off-the-shelf software for which evidence of use is available
3.2.7 critical software
software of criticality category A, B or C
NOTE See Table D-1: Software criticality categories.
3.2.8 deactivated code
code that, although incorporated through correct design and coding, is intended to execute in certain software product configurations only, or in none of them
[adapted from RTCA/DO-178B]
3.2.9 decision
boolean expression composed of conditions and zero or more boolean operators that are used in a control construct.
NOTE 1 For example: “if ... then ... else” or the “case” statement are control constructs.
NOTE 2 A decision without a boolean operator is a condition.
NOTE 3 If a condition appears more than once in a decision, each occurrence is a distinct condition.
3.2.10 decision coverage
measure of the part of the program within which every point of entry and exit is invoked at least once and every decision has taken “true” and “false” values at least once.
NOTE Decision coverage includes, by definition, statement coverage.
3.2.11 existing software
any software developed outside the business agreement to which this Standard is applicable, including software from previous developments provided by the supplier, software from previous developments provided by the customer, COTS, OTS and MOTS software, freeware and open source software
3.2.12 integration testing
testing in which software components, hardware components, or both are combined and tested to evaluate the interaction between them
[IEEE 610.12:1990]
3.2.13 logical model
implementation-independent model of software items used to analyse and document software requirements
3.2.14 margin philosophy
rationale for margins allocated to the performance parameters and computer resources of a development, and the way to manage these margins during the execution of the project
3.2.15 metric
defined measurement method and the measurement scale
NOTE 1 Metrics can be internal or external, and direct or indirect.
NOTE 2 Metrics include methods for categorising qualitative data.
[ISO/IEC 9126-1:2001]
3.2.16 migration
porting of a software product to a new environment
3.2.17 mission products
products and services delivered by the space system
NOTE For example: communications services, science data.
3.2.18 modified condition and decision coverage
measure of the part of the program within which every point of entry and exit has been invoked at least once, every decision in the program has taken “true” and “false” values at least once, and each condition in a decision has been shown to independently affect that decision’s outcome
NOTE A condition is shown to independently affect a decision’s outcome by varying that condition while holding fixed all other possible conditions.
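As an informative illustration of the definition above (not part of the normative text), consider the hypothetical decision `a and (b or c)` with three conditions. The sketch below checks that a candidate test set demonstrates independence for each condition, i.e. that it contains a pair of test vectors differing only in that condition and producing different decision outcomes:

```python
# Illustrative sketch of MC/DC (hypothetical decision and test set):
# each condition must be shown to independently affect the decision's outcome.

def decision(a: bool, b: bool, c: bool) -> bool:
    """Example decision with three conditions and two boolean operators."""
    return a and (b or c)

def shows_independence(test_set, index) -> bool:
    """True if the test set contains a pair of vectors that differ only in
    condition `index` and produce different decision outcomes."""
    for t1 in test_set:
        for t2 in test_set:
            differs_only_here = all(
                (t1[i] != t2[i]) == (i == index) for i in range(len(t1))
            )
            if differs_only_here and decision(*t1) != decision(*t2):
                return True
    return False

# A minimal MC/DC test set for `a and (b or c)`: n + 1 = 4 vectors for
# n = 3 conditions, far fewer than the 8 of exhaustive testing.
tests = [
    (True, True, False),   # decision is True
    (False, True, False),  # toggling a (vs. the first vector) flips it
    (True, False, False),  # toggling b (vs. the first vector) flips it
    (True, False, True),   # toggling c (vs. the vector above) flips it
]

mcdc_met = all(shows_independence(tests, i) for i in range(3))
```

This also illustrates why MC/DC scales: a decision with n conditions typically needs only n + 1 test vectors rather than 2^n.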
3.2.19 operational
for the purpose of this Standard, related to the software operation
NOTE It is not related to the spacecraft operation.
3.2.20 portability (a quality characteristic)
capability of software to be transferred from one environment to another
3.2.21 quality characteristics (software)
set of attributes of a software product by which its quality is described and evaluated
NOTE A software quality characteristic can have multiple levels of sub-characteristics.
3.2.22 quality model (software)
set of characteristics and the relationships between them which provide the basis for specifying quality requirements and evaluating quality
[ISO/IEC 9126-1:2001]
3.2.23 real-time
pertaining to a system or mode of operation in which computation is performed during the actual time that an external process occurs, in order that the computation results can be used to control, monitor, or respond in a timely manner to the external process
[IEEE 610.12:1990]
3.2.24 regression testing (software)
selective retesting of a system or component to verify that modifications have not caused unintended effects and that the system or component still complies with its specified requirements
[IEEE 610.12:1990]
3.2.25 reusability
degree to which a software unit or other work product can be used in more than one computer program or software system
[IEEE 610.12:1990]
3.2.26 singular input
input corresponding to a singularity of the function
3.2.27 software
see “software product” in ECSS-S-ST-00-01
3.2.28 software component
part of a software system
NOTE 1 Software component is used as a general term.
NOTE 2 Components can be assembled and decomposed to form new components. In the production activities, components are implemented as units, tasks or programs, any of which can be configuration items. This usage of the term is more general than in ANSI/IEEE parlance, which defines a component as a “basic part of a system or program”; in this Standard, components are not always “basic” as they can be decomposed.
3.2.29 software intensive system
space system in which the dominant part of the constituents are software elements
NOTE In such systems, subsystems consist mainly of software. For this type of system, the majority of interfaces are software-software interfaces.
3.2.30 software item
see “software product” in ECSS-S-ST-00-01
3.2.31 software observability
property of a system for which the values of status variables can be determined through observations of the output variables
3.2.32 software problem
condition of a software product that causes difficulty or uncertainty in the use of the software
[CMU/SEI-92-TR-022]
3.2.33 software product assurance
totality of activities, standards, controls and procedures in the lifetime of a software product which establishes confidence that the delivered software product, or software affecting the quality of the delivered product, conforms to customer requirements
3.2.34 software unit
separately compilable piece of source code
NOTE In this Standard no distinction is made between a software unit and a database; both are covered by the same requirements.
3.2.35 statement coverage
measure of the part of the program within which every executable source code statement has been invoked at least once.
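As an informative illustration (not part of the normative text), statement coverage of a small hypothetical unit can be measured with Python's standard tracing hook; the unit, its line count and the test input below are all invented for the example:

```python
# Illustrative sketch (not normative): measuring statement coverage of a
# small unit with the standard-library tracing facility.
import sys

def unit(x: int) -> int:
    if x > 0:
        return x * 2
    return -x  # not reached by the single test input below

executed = set()  # line numbers of `unit` observed during the test run

def tracer(frame, event, arg):
    # Record each source line executed inside `unit`.
    if event == "line" and frame.f_code.co_name == "unit":
        executed.add(frame.f_lineno)
    return tracer

sys.settrace(tracer)
unit(5)              # a single test input: exercises only the x > 0 branch
sys.settrace(None)

# unit() has three executable statements; the input x = 5 exercises two of
# them, so statement coverage is 2/3. Full coverage needs a second input
# with x <= 0.
coverage = len(executed) / 3
```

In practice a coverage tool performs this bookkeeping, but the principle is the same: count executed statements against the total.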
3.2.36 stress test
test that evaluates a system or software component at or beyond its required capabilities
3.2.37 test case
set of test inputs, execution conditions and expected results developed for a particular objective such as to exercise a particular program path or to verify compliance with a specified requirement
3.2.38 test design
documentation specifying the details of the test approach for a software feature or combination of software features and identifying associated tests
3.2.39 test procedure
detailed instructions for the set-up, operation and evaluation of the results for a given test
3.2.40 test script
file containing a set of commands or instructions written in native format (computer or tool processable) in order to automate the execution of one or a combination of test procedures (and the associated evaluation of the results)
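As an informative illustration (not part of the normative text), a test script in this sense both executes test procedures and evaluates their results automatically. The procedure names and expected values below are hypothetical:

```python
# Illustrative sketch (hypothetical procedure names): a test script that
# automates the execution of test procedures and the evaluation of results.

def procedure_power_on() -> int:
    """Stand-in for a test procedure; returns a measured value."""
    return 28  # e.g. bus voltage in volts

def procedure_heater_off() -> int:
    """Stand-in for a second test procedure."""
    return 0   # e.g. heater current in milliamps

# Each entry pairs a procedure with its expected result, so the script can
# perform the associated evaluation without manual intervention.
TEST_PROCEDURES = [
    (procedure_power_on, 28),
    (procedure_heater_off, 0),
]

def run_script() -> dict:
    """Execute all procedures and record pass/fail per procedure."""
    return {
        proc.__name__: (proc() == expected)
        for proc, expected in TEST_PROCEDURES
    }
```

A real test script would typically drive external equipment or software over a tool-specific interface; the structure (execute, compare against expected results, record verdicts) is what the definition captures.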
3.2.41 unit test
test of individual software unit
3.2.42 unreachable code
code that cannot be executed due to design or coding error
3.2.43 usability (a quality characteristic)
capability of the software to be understood, learned, used and liked by the user, when used under specified conditions
3.2.44 validation
<software> process to confirm that the requirements baseline functions and performances are correctly and completely implemented in the final product
3.2.45 verification
<software> process to confirm that adequate specifications and inputs exist for any activity, and that the outputs of the activities are correct and consistent with the specifications and input
3.2.46 walk-through
static analysis technique in which a designer or programmer leads members of the development team and other interested parties through a software product, and the participants ask questions and make comments about possible errors, violation of development standards, and other problems
[IEEE 1028-1997]
3.3 Abbreviated terms
For the purpose of this Standard and of ECSS-E-ST-40, the abbreviated terms from ECSS-S-ST-00-01 and the following apply. For the definition of DRD acronyms see Annex A.
NOTE The abbreviated terms are common for the ECSS-E-ST-40 and ECSS-Q-ST-80 Standards.
Abbreviation   Meaning
AR             acceptance review
               NOTE The term SW-AR can be used for clarity to denote ARs that solely involve software products.
CDR            critical design review
               NOTE The term SW-CDR can be used for clarity to denote CDRs that solely involve software products.
CMMI           capability maturity model integration
COTS           commercial off-the-shelf
CPU            central processing unit
DDF            design definition file
DDR            detailed design review
DJF            design justification file
DRD            document requirements definition
ECSS           European Cooperation for Space Standardization
eo             expected output
GS             ground segment
HMI            human-machine interface
HSIA           hardware-software interaction analysis
HW             hardware
ICD            interface control document
INTRSA         international registration scheme for assessors
IRD            interface requirements document
ISO            International Organization for Standardization
ISV            independent software validation
ISVV           independent software verification and validation
MGT            management file
MF             maintenance file
MOTS           modified off-the-shelf
OBCP           on-board control procedure
OP             operational plan
ORR            operational readiness review
OTS            off-the-shelf
PAF            product assurance file
PDR            preliminary design review
               NOTE The term SW-PDR can be used for clarity to denote PDRs that solely involve software products.
PRR            preliminary requirement review
QR             qualification review
               NOTE The term SW-QR can be used for clarity to denote QRs that solely involve software products.
RB             requirements baseline
SCAMPI         standard CMMI appraisal method for process improvement
SDE            software development environment
SOS            software operation support
SPA            software product assurance
SPAMR          software product assurance milestone report
SPAP           software product assurance plan
SPR            software problem report
SRB            software review board
SRR            system requirements review
               NOTE The term SW-SRR can be used for clarity to denote SRRs that solely involve software products.
SW             software
SWE            software engineering
TRR            test readiness review
TS             technical specification
4 Space system software product assurance principles
4.1 Introduction
The objectives of software product assurance are to provide adequate confidence to the customer and to the supplier that the developed or procured/reused software satisfies its requirements throughout the system lifetime. In particular, that the software is developed to perform properly and safely in its operational environment, meeting the quality objectives agreed for the project.
This Standard contributes to these objectives by defining the software product assurance requirements to be met in a particular space project. These requirements deal with quality management and framework, life cycle activities and process definition, and quality characteristics of products.
One of the fundamental principles of this Standard is the customer-supplier relationship, assumed for all software developments. The organizational aspects of this are defined in ECSS-M-ST-10. The customer is, in the general case, the procurer of two strongly associated products: the hardware and the software components of a system, subsystem, set, equipment or assembly. The concept of the customer-supplier relationship is applied recursively, i.e. the customer can himself be a supplier to a higher level in the space system hierarchy.
The requirements of this Standard are applicable to the supplier, unless
otherwiseexplicitlystated.
The supplier demonstrates compliance with the software product assurance
requirementsandprovidesthespecifiedevidenceofcompliance.
Tothisend,thesupplierspecifiesthesoftwareproductassurancerequirements
for his/her suppliers, taking into account their responsibilities and the specific
natureoftheirdeliverables.
This Standard complements ECSSEST40 “Space engineering Software
generalrequirements”,withproductassuranceaspects,integratedinthe space
system software engineering processes as defined in ECSSEST40. Together
thetwostandardsspecifyallprocessesforspacesoftwaredevelopment.
Figure41schematicallypresentsthedifferentSoftwareprocessesaddressedby
thesetoftheECSSstandards.
[Figure 4-1 depicts the life cycle processes (acquisition, supply, development, operation, maintenance), the supporting life cycle processes (documentation, configuration management, quality assurance, verification, validation, joint review, audit, problem resolution) and the organizational life cycle processes (management, improvement, infrastructure, training), and indicates which of them are addressed by other ECSS Standards, by ECSS-E-ST-40 and by ECSS-Q-ST-80, with details for SPA and/or SWE.]
Figure 4-1: Software related processes in ECSS Standards
4.2 Organization of this Standard
This Standard is organized into three main parts:
• Software product assurance programme implementation;
• Software process assurance;
• Software product quality assurance.
The software documentation collecting the expected output of the ECSS-E-ST-40 and ECSS-Q-ST-80 requirements is summarized in Annex A.
Annex B and Annex C specify the DRDs (document requirements definitions) of the software product assurance documents (SPAP and SPAMR). The DRDs of other software engineering and management documents are included in ECSS-E-ST-40 and ECSS-M-ST-40.
In the preparation of this Standard the ISO/IEC 12207 standard has been used extensively, providing a common internationally recognized framework for the terminology and software life cycle processes description.
The organization of this Standard is reflected in detail in Figure 4-2.
Software product assurance programme implementation
5.1 Organization and responsibility
5.2 Software product assurance programme management
5.3 Risk management and critical item control
5.4 Supplier selection and control
5.5 Procurement
5.6 Tools and supporting environment
5.7 Assessment and improvement process
Software process assurance
6.1 Software development life cycle
6.2 Requirements applicable to all software engineering processes
6.3 Requirements applicable to individual software engineering processes or activities
Software product quality assurance
7.1 Product quality objectives and metrication
7.2 Product quality requirements
7.3 Software intended for reuse
7.4 Standard ground hardware and services for operational system
7.5 Firmware
Figure 4-2: Structure of this Standard
Each requirement of this Standard is identified by a hierarchical number, plus a letter if necessary (e.g. 5.3.1.5, bullet a). For each requirement, the associated output is given in the "Expected Output" section. When several outputs are expected, they are identified by a letter (e.g. "a", "b", etc.). With each output, the destination file of the output is indicated in brackets, together with the corresponding document DRD (after a comma) and review(s) (after a semicolon). For example, [PAF, SPAP; SRR] denotes an output contained in the Software Product Assurance Plan, part of the Product Assurance File, and required at SRR. When no DRD is defined for an Expected Output, and/or the Expected Output is not to be provided at any specific milestone review, then the corresponding sections of that Expected Output are replaced by dashes (e.g. [PAF, -; -]).
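The Expected Output notation described above is regular enough to be handled mechanically, for example when building a project compliance tool. The following sketch is a hypothetical helper, not part of the Standard; it splits a tag into destination file, DRD and reviews, treating a dash as "not defined":

```python
import re

def parse_expected_output(tag: str):
    """Parse an Expected Output tag such as '[PAF, SPAP; SRR, PDR]'.

    Returns (destination_file, drd, reviews); a '-' in the tag means
    no DRD is defined or no milestone review is required.
    """
    m = re.fullmatch(r"\[\s*([^,;\]]+)\s*,\s*([^;\]]+)\s*;\s*([^\]]*)\]",
                     tag.strip())
    if not m:
        raise ValueError(f"not an Expected Output tag: {tag!r}")
    file_, drd, reviews = (part.strip() for part in m.groups())
    drd = None if drd == "-" else drd
    review_list = [] if reviews == "-" else [r.strip() for r in reviews.split(",")]
    return file_, drd, review_list
```

For instance, `parse_expected_output("[PAF, -; -]")` yields a PAF entry with no DRD and no associated milestone review.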
This Standard details, for the software product assurance aspects, some of the general requirements already addressed by the ECSS management, product assurance and quality assurance standards.
4.3 Tailoring of this Standard
The general information and requirements for the selection and tailoring of applicable standards are defined in ECSS-S-ST-00.
There are several drivers for tailoring, such as dependability and safety aspects, software development constraints, product quality objectives and business objectives.
Tailoring for dependability and safety aspects is based on the selection of requirements related to the verification, validation and levels of proofs demanded by the criticality of the software. Annex D contains a tailoring of this Standard based on software criticality.
Tailoring for software development constraints takes into account the special characteristics of the software being developed, and of the development environment. The type of software development (e.g. database or real-time) and the target system (e.g. embedded processor, host system, programmable devices, or application-specific integrated circuits) are also taken into account (see Annex S of ECSS-E-ST-40). Specific requirements for verification, review and inspection are imposed, for example, when full validation on the target computer is not feasible or where performance goals are difficult to achieve.
Tailoring for product quality and business objectives is done by selecting requirements on quality of the product as explained in clause 7 of this Standard, based on the quality objectives for the product specified by the customer.
5
Software product assurance programme implementation
5.1 Organization and responsibility
5.1.1 Organization
a. The supplier shall ensure that an organizational structure is defined for
software development, and that individuals have defined tasks and
responsibilities.
5.1.2 Responsibility and authority
5.1.2.1
a. The responsibility, the authority and the interrelation of personnel who manage, perform and verify work affecting software quality shall be defined and documented.
EXPECTED OUTPUT: Software product assurance plan [PAF,
SPAP; SRR].
5.1.2.2
a. The responsibilities and the interfaces of each organization, either external or internal, involved in a project shall be defined and documented.
EXPECTED OUTPUT: Software product assurance plan [PAF,
SPAP; SRR].
5.1.2.3
a. The delegation of software product assurance tasks by a supplier to a lower level supplier shall be done in a documented and controlled way, with the supplier retaining the responsibility towards the customer.
EXPECTED OUTPUT: Software product assurance plan [PAF,
SPAP; SRR].
5.1.3 Resources
5.1.3.1
a. The supplier shall provide adequate resources to perform the required software product assurance tasks.
EXPECTED OUTPUT: Software product assurance plan [PAF,
SPAP; SRR].
5.1.3.2
a. Reviews and audits of processes and of products shall be carried out by personnel not directly involved in the work being performed.
5.1.4 Software product assurance manager/engineer
5.1.4.1
a. The supplier shall identify the personnel responsible for software product assurance for the project (SWPA manager/engineer).
EXPECTED OUTPUT: Software product assurance plan [PAF,
SPAP; SRR].
5.1.4.2
a. The software product assurance manager/engineer shall:
1. report to the project manager (through the project product assurance manager, if any);
2. have organizational authority and independence to propose and maintain a software product assurance programme in accordance with the project software product assurance requirements;
3. have unimpeded access to higher management as necessary to fulfil his/her duties.
5.1.5 Training
5.1.5.1
a. The supplier shall review the project requirements to establish and make timely provision for acquiring or developing the resources and skills for the management and technical staff.
EXPECTED OUTPUT: Training plan [MGT, -; SRR].
5.1.5.2
a. The supplier shall maintain training records.
EXPECTED OUTPUT: Records of training and experience [PAF, -; -].
5.1.5.3
a. The supplier shall ensure that the right composition and categories of appropriately trained personnel are available for the planned activities and tasks in a timely manner.
5.1.5.4
a. The supplier shall determine the training subjects based on the specific tools, techniques, methodologies and computer resources to be used in the development and management of the software product.
NOTE Personnel can undergo training to acquire skills and knowledge relevant to the specific field with which the software is to deal.
5.2 Software product assurance programme management
5.2.1 Software product assurance planning and
control
5.2.1.1
a. The supplier shall develop a software product assurance plan in response to the software product assurance requirements in conformance with the DRD in Annex B.
b. The software product assurance plan shall be either a stand-alone document or a section of the supplier's overall product assurance plan.
EXPECTED OUTPUT: Software product assurance plan [PAF,
SPAP; SRR, PDR].
5.2.1.2
a. Any internal manuals, standards or procedures referred to by the software product assurance plan shall become an integral part of the supplier's software product assurance programme.
5.2.1.3
a. The software product assurance plan shall be revisited and updated as needed at each milestone to ensure that the activities to be undertaken in the following phase are fully defined.
EXPECTED OUTPUT: Software product assurance plan [PAF,
SPAP; CDR, QR, AR, ORR].
5.2.1.4
a. Before acceptance review, the supplier shall either supplement the software product assurance plan to specify the quality measures related to the operations and maintenance processes, or issue a specific software product assurance plan.
EXPECTED OUTPUT: Software product assurance plan [PAF,
SPAP; AR].
5.2.1.5
a. The supplier shall provide with the software product assurance plan a compliance matrix documenting conformance with the individual software product assurance requirements applicable for the project or business agreement.
EXPECTED OUTPUT: Software product assurance plan [PAF,
SPAP; SRR, PDR].
b. For each software product assurance requirement, the compliance matrix shall provide a reference to the document where the expected output of that requirement is located.
NOTE For compliance with the required DRDs a general statement of compliance is acceptable.
EXPECTED OUTPUT: Software product assurance plan [PAF,
SPAP; SRR, PDR].
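A compliance matrix of this kind lends itself to simple tooling. The sketch below is illustrative only; the requirement identifiers, compliance codes and document references are hypothetical examples, and the check merely flags requirements whose expected-output reference is missing:

```python
# Each entry: requirement id -> (compliance statement, reference to the
# document holding the expected output). All values are invented examples.
matrix = {
    "5.1.2.1a": ("C", "SPAP issue 2, section 4.1"),
    "5.2.1.5a": ("C", "SPAP issue 2, annex A"),
    "6.2.3.4a": ("PC", "SPAP issue 2, section 7.3"),
}

def missing_references(matrix):
    """Return requirement ids whose expected-output reference is absent."""
    return [req for req, (_, ref) in matrix.items() if not ref]
```

A matrix kept in this form can be checked automatically before each milestone review delivery.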
5.2.2 Software product assurance reporting
5.2.2.1
a. The supplier shall report on a regular basis on the status of the software product assurance programme implementation, if appropriate as part of the overall product assurance reporting of the project.
EXPECTED OUTPUT: Software product assurance reports [PAF, -; -].
5.2.2.2
a. The software product assurance report shall include:
1. an assessment of the current quality of the product and processes, based on measured properties, with reference to the metrication as defined in the software product assurance plan;
2. verifications undertaken;
3. problems detected;
4. problems resolved.
EXPECTED OUTPUT: Software product assurance reports [PAF, -; -].
5.2.2.3
a. The supplier shall deliver at each milestone review a software product assurance milestone report, covering the software product assurance activities performed during the past project phases.
EXPECTED OUTPUT: Software product assurance milestone report
[PAF, SPAMR; SRR, PDR, CDR, QR, AR,
ORR].
5.2.3 Audits
a. For software audits, ECSS-Q-ST-10 clause 5.2.3 shall apply.
EXPECTED OUTPUT: Audit plan and schedule [PAF, -; SRR].
5.2.4 Alerts
a. For software alerts, ECSS-Q-ST-10 clause 5.2.9 shall apply.
EXPECTED OUTPUT: The following outputs are expected:
a. Preliminary alert information [PAF, -; -];
b. Alert information [PAF, -; -].
5.2.5 Software problems
5.2.5.1
a. The supplier shall define and implement procedures for the logging, analysis and correction of all software problems encountered during software development.
EXPECTED OUTPUT: Software problem reporting procedures
[PAF, -; PDR].
5.2.5.2
a. The software problem report shall contain the following information:
1. identification of the software item;
2. description of the problem;
3. recommended solution;
4. final disposition;
5. modifications implemented (e.g. documents, code, and tools);
6. tests re-executed.
EXPECTED OUTPUT: Software problem reporting procedures
[PAF, -; PDR].
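The content required of a software problem report maps naturally onto a record type. The following dataclass is one possible rendering, with field names of our own choosing rather than terms mandated by the Standard:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class SoftwareProblemReport:
    software_item: str                # identification of the software item
    description: str                  # description of the problem
    recommended_solution: str = ""
    final_disposition: str = ""
    # Modifications implemented (e.g. documents, code, and tools).
    modifications: List[str] = field(default_factory=list)
    tests_reexecuted: List[str] = field(default_factory=list)

    def is_closed(self) -> bool:
        # Treat a report as closed once a final disposition is recorded;
        # this closure criterion is an assumption for the example.
        return bool(self.final_disposition)
```

Keeping the report as structured data also eases the interface with the nonconformance system required by clause 5.2.5.3.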
5.2.5.3
a. The procedures for software problems shall define the interface with the nonconformance system (i.e. the circumstances under which a problem qualifies as a nonconformance).
EXPECTED OUTPUT: Software problem reporting procedures
[PAF, -; PDR].
5.2.5.4
a. The supplier shall ensure the correct application of problem reporting
procedures.
5.2.6 Nonconformances
5.2.6.1
a. For software nonconformance handling, ECSS-Q-ST-10-09 shall apply.
EXPECTED OUTPUT: The following outputs are expected:
a. NCR SW procedure as part of the Software
product assurance plan [PAF, SPAP; SRR];
b. Nonconformance reports [DJF, -; -].
b. When dealing with software nonconformance, the NRB shall include, at least, a representative from the software product assurance and the software engineering organizations.
EXPECTED OUTPUT: Identification of SW experts in NRB [MGT, -;
SRR]
5.2.6.2
a. The software product assurance plan shall specify the point in the software life cycle from which the nonconformance procedures apply.
EXPECTED OUTPUT: Software product assurance plan [PAF,
SPAP; SRR, PDR].
5.2.7 Quality requirements and quality models
5.2.7.1
a. Quality models shall be used to specify the software quality
requirements.
EXPECTED OUTPUT: Software product assurance plan [PAF,
SPAP; PDR].
5.2.7.2
a. The following characteristics shall be used to specify the quality model:
1. functionality;
2. reliability;
3. maintainability;
4. reusability;
5. suitability for safety;
6. security;
7. usability;
8. efficiency;
9. portability;
10. software development effectiveness.
NOTE 1 Quality models are the basis for the identification of process metrics (see clause 6.2.5) and product metrics (see clause 7.1.4).
NOTE 2 Quality models are also addressed by ISO/IEC 9126 or ECSS-Q-HB-80-04.
EXPECTED OUTPUT: Software product assurance plan [PAF,
SPAP; PDR].
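As an illustration of how such a quality model can drive metrication, the sketch below maps a few of the characteristics listed above to example metrics. The metric names are our own invention, not taken from this Standard, ISO/IEC 9126 or ECSS-Q-HB-80-04:

```python
# Illustrative quality model: characteristics (from the list above) mapped
# to example metrics chosen for a hypothetical project.
quality_model = {
    "reliability": ["failure rate during validation", "open SPR count"],
    "maintainability": ["cyclomatic complexity", "comment density"],
    "portability": ["count of compiler-specific constructs"],
}

def metrics_for(characteristic: str):
    """Return the metrics selected for a characteristic (empty if none)."""
    return quality_model.get(characteristic, [])
```

In practice every characteristic retained for the project would carry at least one metric, so an empty result flags a gap in the metrication.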
5.3 Risk management and critical item control
5.3.1 Risk management
a. Risk management for software shall be performed by cross-reference to the project risk policy, as specified in ECSS-M-ST-80.
5.3.2 Critical item control
5.3.2.1
a. For critical item control, ECSS-Q-ST-10-04 shall apply.
5.3.2.2
a. The supplier shall identify the characteristics of the software items that qualify them for inclusion in the Critical Item List.
5.4 Supplier selection and control
5.4.1 Supplier selection
5.4.1.1
a. For supplier selection, ECSS-Q-ST-20 clause 5.4.1 shall apply.
EXPECTED OUTPUT: The following outputs are expected:
a. Results of pre-award audits and assessments
[PAF, -; -];
b. Records of procurement sources [PAF, -; -].
5.4.1.2
a. For the selection of suppliers of existing software, including software contained in OTS equipment and units, the expected output of clauses 6.2.7.2 to 6.2.7.6 shall be made available.
EXPECTED OUTPUT: Software reuse file [DJF, SRF; -].
5.4.2 Supplier requirements
5.4.2.1
a. The supplier shall establish software product assurance requirements for the next level suppliers, tailored to their role in the project, including a requirement to produce a software product assurance plan.
EXPECTED OUTPUT: Software product assurance requirements for
suppliers [PAF, -; SRR].
5.4.2.2
a. The supplier shall provide the software product assurance requirements applicable to the next level suppliers for the customer's acceptance.
EXPECTED OUTPUT: Software product assurance requirements for
suppliers [PAF, -; SRR].
5.4.3 Supplier monitoring
5.4.3.1
a. The supplier shall monitor the next lower level suppliers' conformance to the product assurance requirements.
5.4.3.2
a. The monitoring process shall include the review and approval of the next lower level suppliers' product assurance plans, the continuous verification of processes and products, and the monitoring of the final validation of the product.
5.4.3.3
a. The supplier shall ensure that software development processes are defined and applied by the next lower level suppliers in conformance with the software product assurance requirements for suppliers.
EXPECTED OUTPUT: Next level suppliers’ software product
assurance plan [PAF, SPAP; PDR].
5.4.3.4
a. The supplier shall provide the next lower level suppliers' software product assurance plan for the customer's acceptance.
EXPECTED OUTPUT: Next level suppliers’ software product
assurance plan [PAF, SPAP; PDR].
5.4.4 Criticality classification
a. The supplier shall provide the lower level suppliers with the relevant results of the safety and dependability analyses performed at higher level and at his level (ref. clauses 6.2.2.1 and 6.2.2.2), including:
1. the criticality classification of the software products to be developed;
2. information about the failures that can be caused at higher level by the software products to be developed.
EXPECTED OUTPUT: Safety and dependability analyses results for
lower level suppliers [RB, -; SRR].
5.5 Procurement
5.5.1 Procurement documents
a. For procurement documents, ECSS-Q-ST-20 clause 5.4.2 shall apply.
5.5.2 Review of procured software component list
a. The choice of procured software shall be described and submitted for customer review.
EXPECTED OUTPUT: Software development plan [MGT, SDP;
SRR, PDR].
5.5.3 Procurement details
a. For each of the software items the following data shall be provided:
1. ordering criteria;
NOTE For example: versions, options and extensions.
2. receiving inspection criteria;
3. backup solutions if the product becomes unavailable;
4. contractual arrangements with the supplier for the development, maintenance and upgrades to new releases.
EXPECTED OUTPUT: Procurement data [MGT, -; SRR, PDR].
5.5.4 Identification
a. All the procured software shall be identified and registered by configuration management.
5.5.5 Inspection
a. The supplier shall subject the procured software to a planned receiving inspection, in accordance with ECSS-Q-ST-20 clause 5.4.4, and the receiving inspection criteria as required by clause 5.5.3.
EXPECTED OUTPUT: Receiving inspection report [PAF, -; PDR,
CDR, QR].
5.5.6 Exportability
a. Exportability constraints shall be identified.
5.6 Tools and supporting environment
5.6.1 Methods and tools
5.6.1.1
a. Methods and tools to be used for all the activities of the development cycle (including requirements analysis, software specification, modelling, design, coding, validation, testing, configuration management, verification and product assurance) shall be identified by the supplier and agreed by the customer.
EXPECTED OUTPUT: Software product assurance plan [PAF,
SPAP; SRR, PDR].
5.6.1.2
a. The choice of development methods and tools shall be justified by demonstrating through testing or documented assessment that:
1. the development team has appropriate experience or training to apply them,
2. the tools and methods are appropriate for the functional and operational characteristics of the product, and
3. the tools are available (in an appropriate hardware environment) throughout the development and maintenance lifetime of the product.
EXPECTED OUTPUT: Software product assurance milestone report
[PAF, SPAMR; SRR, PDR].
5.6.1.3
a. The correct use of methods and tools shall be verified and reported.
EXPECTED OUTPUT: Software product assurance reports [PAF, -; -].
5.6.2 Development environment selection
5.6.2.1
a. The software development environment shall be selected according to the following criteria:
1. availability;
2. compatibility;
3. performance;
4. maintenance;
5. durability and technical consistency with the operational equipment;
6. the assessment of the product with respect to requirements, including the criticality category;
7. the available support documentation;
8. the acceptance and warranty conditions;
9. the conditions of installation, preparation, training and use;
10. the maintenance conditions, including the possibilities of evolutions;
11. copyright and intellectual property rights constraints;
12. dependence on one specific supplier.
EXPECTED OUTPUT: Software development plan [MGT, SDP;
SRR, PDR].
5.6.2.2
a. The suitability of the software development environment shall be
justified.
EXPECTED OUTPUT: Software development plan [MGT, SDP;
SRR, PDR].
5.6.2.3
a. The availability of the software development environment to developers and other users shall be verified before the start of each development phase.
5.7 Assessment and improvement process
5.7.1 Process assessment
a. The supplier shall monitor and control the effectiveness of the processes used during the development of the software, including the relevant processes corresponding to the services called from other organizational entities outside the project team.
NOTE The process assessment and improvement performed at organization level can be used to provide evidence of compliance for the project.
EXPECTED OUTPUT: Software process assessment records:
Overall assessments and improvement
programme plan [PAF, -; -].
5.7.2 Assessment process
5.7.2.1
a. The process assessment model and method to be used when performing any software process assessment shall be documented.
EXPECTED OUTPUT: The following outputs are expected:
a. Software process assessment record: assessment
model [PAF, -; -];
b. Software process assessment record: assessment
method [PAF, -; -].
5.7.2.2
a. Assessments performed and process assessment models used shall be in conformance with ISO/IEC 15504 (Part 2).
EXPECTED OUTPUT: The following outputs are expected:
a. Software process assessment record: evidence of
conformance of the process assessment model
[PAF, -; -];
b. Software process assessment record: assessment
method [PAF, -; -].
NOTE 1 The model and method documented in ECSS-Q-HB-80-02 are conformant to ISO/IEC 15504 (Part 2).
NOTE 2 Currently the CMMI model is not fully conformant to ISO/IEC 15504; however, it can be used provided that the SCAMPI A method is applied.
5.7.2.3
a. The process assessment model, the method, the assessment scope, the results and the assessors shall be verified as complying with the project requirements.
NOTE 1 Examples of assessment scopes are: organizational unit evaluated, and processes evaluated.
NOTE 2 ECSS-Q-HB-80-02 provides space-specific process reference models and their indicators.
EXPECTED OUTPUT: Software process assessment record:
Software process assessment recognition
evidence [PAF, -; -].
5.7.2.4
a. Assessments, carried out in accordance with ECSS-Q-HB-80-02, shall be performed by a competent assessor, whereas the other assessment team members can be either competent assessors or provisional assessors.
NOTE 1 For other assessment schemes conformant to ISO/IEC 15504 (Part 2), assessors certified under INTRSA are competent assessors.
NOTE 2 When using CMMI/SCAMPI A, SEI authorized lead appraisers are competent assessors.
EXPECTED OUTPUT: Software process assessment record:
competent assessor justification [PAF, -; -].
5.7.3 Process improvement
5.7.3.1
a. The results of the assessment shall be used as feedback to improve as necessary the performed processes, to recommend changes in the direction of the project, and to determine technology advancement needs.
b. The supplier shall ensure that the results of previous assessments are used in its project activity.
EXPECTED OUTPUT: Software process assessment records:
improvement plan [PAF, -; -].
5.7.3.2
a. The process improvement shall be conducted according to a documented process improvement process.
NOTE 1 For the definition of the process improvement process, see ECSS-Q-HB-80-02.
NOTE 2 For CMMI, the process improvement is described in the OPF (Organizational Process Focus) process area.
EXPECTED OUTPUT: Software process assessment records:
improvement process [PAF, -; -].
5.7.3.3
a. Evidence of the improvement in performed processes or in project documentation shall be provided.
NOTE See ECSS-Q-HB-80-02.
EXPECTED OUTPUT: Software process assessment records:
evidence of improvements [PAF, -; -].
6
Software process assurance
6.1 Software development life cycle
6.1.1 Life cycle definition
a. The software development life cycle shall be defined or referenced in the software product assurance plan.
b. The following characteristics of the software life cycle shall be defined:
1. phases;
2. input and output of each phase;
3. status of completion of phase output;
4. milestones;
5. dependencies;
6. responsibilities;
7. role of the customer at each milestone review, in conformance with ECSS-M-ST-10 and ECSS-M-ST-10-01.
EXPECTED OUTPUT: Software product assurance plan [PAF,
SPAP; SRR, PDR].
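The characteristics required for the life cycle definition can be captured in a simple machine-readable form alongside the plan. The sketch below is illustrative only; the phase names, inputs, outputs and milestone assignments are invented, not mandated by the Standard:

```python
# Hypothetical life cycle definition: each phase records its inputs,
# outputs, closing milestone and responsible party.
phases = [
    {"name": "requirements analysis", "inputs": ["RB"], "outputs": ["TS"],
     "milestone": "PDR", "responsible": "supplier"},
    {"name": "validation", "inputs": ["TS", "code"],
     "outputs": ["validation report"], "milestone": "QR",
     "responsible": "supplier"},
]

def milestone_of(phase_name: str):
    """Look up the milestone closing a given phase, or None if undefined."""
    for phase in phases:
        if phase["name"] == phase_name:
            return phase["milestone"]
    return None
```

Recording the life cycle this way makes the completeness checks of clauses 6.1.3 and 6.1.4 straightforward to automate.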
6.1.2 Process quality objectives
a. In the definition of the life cycle and associated milestones and documents, the quality objectives shall be used.
6.1.3 Life cycle definition review
a. The software life cycle shall be reviewed against the contractual software engineering and product assurance requirements.
6.1.4 Life cycle resources
a. The software life cycle shall be reviewed for suitability and for the
availability of resources to implement it by all functions involved in its
application.
6.1.5 Software validation process schedule
a. A milestone (SWTRR as defined in ECSS-E-ST-40 clause 5.3.5.1) shall be scheduled immediately before the software validation process starts, to check that:
1. the software status is compatible with the commencement of validation activities;
2. the necessary resources, software product assurance plans, test and validation documentation, simulators or other technical means are available and ready for use.
EXPECTED OUTPUT: Software product assurance plan [PAF,
SPAP; SRR, PDR].
6.2 Requirements applicable to all software engineering
processes
6.2.1 Documentation of processes
6.2.1.1
a. The following activities shall be covered either in software specific plans or in project general plans:
1. development;
2. specification, design and customer documents to be produced;
3. configuration and documentation management;
4. verification, testing and validation activities;
5. maintenance.
EXPECTED OUTPUT: Software project plans [MGT, MF, DJF].
6.2.1.2
a. All plans shall be finalized before the start of the related activities.
EXPECTED OUTPUT: Software project plans [MGT, MF, DJF].
6.2.1.3
a. All plans shall be updated for each milestone to reflect any changes during development.
EXPECTED OUTPUT: Software project plans [MGT, MF, DJF].
6.2.1.4
a. The software product assurance plan shall identify all plans to be produced and used, the relationship between them and the timescales for their preparation and update.
EXPECTED OUTPUT: Software product assurance plan [PAF,
SPAP; SRR, PDR].
6.2.1.5
a. Each plan shall be reviewed against the relevant contractual
requirements.
6.2.1.6
a. Procedures and project standards shall address all types of software products included in the project.
EXPECTED OUTPUT: Procedures and standards [PAF, -; PDR].
6.2.1.7
a. All procedures and project standards shall be finalized before starting the related activities.
EXPECTED OUTPUT: Procedures and standards [PAF, -; PDR].
6.2.1.8
a. Each procedure or standard shall be reviewed against the relevant plans and contractual requirements.
6.2.1.9
a. Before any activity is started, each procedure or standard for that activity shall be reviewed by all functions involved in its application, for suitability and for the availability of resources to implement it.
6.2.2 Software dependability and safety
6.2.2.1
a. For the system-level analyses leading to the criticality classification of software products based on the severity of failure consequences, ECSS-Q-ST-40 Table 6-1 and ECSS-Q-ST-30 Table 5-1 shall apply.
EXPECTED OUTPUT: Criticality classification of software products
[PAF, -; SRR, PDR].
6.2.2.2
a. The supplier shall perform a software dependability and safety analysis of the software products, in accordance with the requirements of ECSS-Q-ST-30 and ECSS-Q-ST-40 and using the results of system-level safety and dependability analyses, in order to determine the criticality of the individual software components.
EXPECTED OUTPUT: Software dependability and safety analysis
report [PAF, -; PDR].
6.2.2.3
a. The supplier shall identify the methods and techniques for the software dependability and safety analysis to be performed at technical specification and design level.
b. Methods and techniques for software dependability and safety analysis shall be agreed between the supplier and customer.
NOTE ECSS-Q-HB-80-03 provides indications on methods and techniques that can be applied, such as:
• software failure modes, effects and criticality analysis (for the performing of this analysis, see also ECSS-Q-ST-30-02);
• software fault tree analysis;
• software common cause failure analysis.
EXPECTED OUTPUT: Criticality classification of software
components [PAF, -; PDR].
6.2.2.4
a. Based on the results of the software criticality analysis, the supplier shall apply engineering measures to reduce the number of critical software components and mitigate the risks associated with the critical software (ref. clause 6.2.3).
6.2.2.5
a. The supplier shall report on the status of the implementation and
verification of the SW dependability and safety analysis
recommendations.
EXPECTED OUTPUT: Software dependability and safety analysis
report [PAF, -; CDR, QR, AR].
6.2.2.6
a. The supplier shall update the software dependability and safety analysis at each software development milestone, to confirm the criticality category of software components.
EXPECTED OUTPUT: Software dependability and safety analysis
report [PAF, -; CDR, QR, AR].
6.2.2.7
a. The supplier shall provide the results of the software dependability and safety analysis for integration into the system-level dependability and safety analyses, addressing in particular:
1. additional failure modes identified at software design level;
2. recommendations for system-level activities.
NOTE For example: introduction of hardware inhibits, and modifications of the system architecture.
EXPECTED OUTPUT: Software dependability and safety analysis
report [PAF, -; PDR, CDR].
6.2.2.8
a. As part of the software requirements analysis activities (ref. clause 6.3.2), the supplier shall contribute to the Hardware-Software Interaction Analysis (HSIA) by identifying, for each hardware failure included in the HSIA, the requirements that specify the software behaviour in the event of that hardware failure.
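In practice this HSIA contribution amounts to a traceability check: every hardware failure in the analysis must point at the software requirements that specify the reaction to it. A minimal sketch, with hypothetical failure and requirement identifiers:

```python
# Hypothetical HSIA table: hardware failure -> software requirements that
# specify the software behaviour in the event of that failure.
hsia = {
    "HW-FAIL-001 (sensor stuck)": ["SRS-231", "SRS-232"],
    "HW-FAIL-002 (bus timeout)": ["SRS-310"],
    "HW-FAIL-003 (heater short)": [],
}

def uncovered_failures(hsia):
    """Return the hardware failures for which no software requirement
    specifies the software behaviour."""
    return [fail for fail, reqs in hsia.items() if not reqs]
```

An empty result from `uncovered_failures` is the condition to demonstrate before the analysis can be considered complete.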
6.2.2.9
a. During the verification and validation of the software requirements resulting from the Hardware-Software Interaction Analysis, the supplier shall verify that the software reacts correctly to hardware failures, and that no undesired software malfunctions occur that may lead to system failures.
6.2.3 Handling of critical software
6.2.3.1
a. The supplier shall define and implement measures to avoid propagation of failures between software components of different criticality.
NOTE This can be achieved by design measures such as separate hardware platforms, isolation of software processes or prohibition of shared memory (segregation and partitioning).
b. Software components whose malfunction may cause failures of higher criticality components shall be classified in accordance with the consequences of those failures.
EXPECTED OUTPUT: The following outputs are expected:
a. Software product assurance plan [PAF, SPAP;
PDR, CDR];
b. Software dependability and safety analysis
report [PAF, -; PDR, CDR, QR, AR].
6.2.3.2
a. The supplier shall define, justify and apply measures to assure the dependability and safety of critical software.
NOTE These measures can include:
• use of software design or methods that have performed successfully in a similar application;
• insertion of features for failure isolation and handling (ref. ECSS-Q-HB-80-03, software failure modes, effects and criticality analysis);
• defensive programming techniques, such as input verification and consistency checks;
• use of a "safe subset" of programming language;
• use of formal design language for formal proof;
• 100 % code branch coverage at unit testing level;
• full inspection of source code;
• witnessed or independent testing;
• gathering and analysis of failure statistics;
• removing deactivated code or showing through a combination of analysis and testing that the means by which such code can be inadvertently executed are prevented, isolated, or eliminated.
EXPECTED OUTPUT: Software product assurance plan [PAF,
SPAP; PDR, CDR].
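Of the measures listed in the note above, defensive programming is the one most directly visible in code. The small illustration below shows input verification and a consistency check; the function, its parameters and the limits are invented for the example and do not come from the Standard:

```python
def set_heater_power(level: int, max_level: int = 100) -> int:
    """Command a hypothetical heater power level, defensively.

    Input verification: reject out-of-range or mistyped values instead
    of propagating them towards the hardware driver.
    """
    if not isinstance(level, int):
        raise TypeError("power level must be an integer")
    if not 0 <= level <= max_level:
        raise ValueError(f"power level {level} outside [0, {max_level}]")
    # Consistency check: the derived duty cycle must stay within [0.0, 1.0];
    # a violation here would indicate an internal logic error.
    duty_cycle = level / max_level
    assert 0.0 <= duty_cycle <= 1.0, "internal inconsistency in duty cycle"
    return level
```

The point is that invalid commands fail loudly at the software boundary rather than producing an undefined effect in the critical component.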
6.2.3.3
a. The application of the chosen measures to handle the critical software shall be verified.
EXPECTED OUTPUT: Software product assurance milestone report
[PAF, SPAMR; PDR, CDR, QR, AR].
6.2.3.4
a. Critical software shall be subject to regression testing after:
1. any change of functionality of the underlying platform hardware;
NOTE For example: instruction set of a processor.
2. any change of the tools that affect directly or indirectly the generation of the executable code.
NOTE In case of minor changes in tools that affect the generation of the executable code, a binary comparison of the executable code generated by the different tools can be used to verify that no modifications are introduced.
EXPECTED OUTPUT: Software product assurance plan [PAF,
SPAP; PDR, CDR].
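The binary comparison mentioned in the note above can be as simple as a byte-for-byte comparison of the two executables, with a digest recorded as evidence. A minimal sketch (file names are placeholders; SHA-256 is an assumed digest choice, not prescribed by the standard):

```python
import hashlib
from pathlib import Path

def binaries_identical(path_a: str, path_b: str) -> bool:
    """Byte-for-byte comparison of two generated executables."""
    return Path(path_a).read_bytes() == Path(path_b).read_bytes()

def sha256_of(path: str) -> str:
    """Digest of one executable, for recording the comparison evidence."""
    return hashlib.sha256(Path(path).read_bytes()).hexdigest()
```

If the executables produced before and after the tool change compare equal, the recorded digests document that no modification was introduced.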
6.2.3.5
a. The need for additional verification and validation of critical software shall be analysed after:
1. any change of functionality or performance of the underlying platform hardware;
2. any change in the environment in which the software or the platform hardware operate.
EXPECTED OUTPUT: Software product assurance plan [PAF,
SPAP; PDR, CDR].
6.2.3.6
a. Identified unreachable code shall be removed and the need for re-verification and re-validation shall be analysed.
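Identifying unreachable code is normally done with static analysis tooling; as a minimal illustration of the idea, the sketch below flags statements that follow an unconditional `return`, `raise`, `break` or `continue` in the same block. This simple check is an assumption-laden stand-in for a real analyser, which also considers loops, exception paths and the call graph:

```python
import ast

def unreachable_statements(source: str) -> list[int]:
    """Line numbers of statements that can never execute because an
    unconditional terminator precedes them in the same block."""
    terminal = (ast.Return, ast.Raise, ast.Break, ast.Continue)
    lines: list[int] = []
    for node in ast.walk(ast.parse(source)):
        for field in ("body", "orelse", "finalbody"):
            block = getattr(node, field, [])
            if not isinstance(block, list):
                continue  # e.g. a lambda body is an expression, not a block
            seen_terminal = False
            for stmt in block:
                if seen_terminal and isinstance(stmt, ast.stmt):
                    lines.append(stmt.lineno)
                if isinstance(stmt, terminal):
                    seen_terminal = True
    return sorted(set(lines))
```

A finding from such a check feeds clause 6.2.3.6: the dead statement is removed, and the need for re-verification of the affected component is then analysed.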
6.2.3.7
a. Unit and integration testing shall be (re)executed on non-instrumented code.
6.2.3.8
a. Validation testing shall be (re)executed on non-instrumented code.
6.2.4 Software configuration management
6.2.4.1
a. ECSS-M-ST-40 shall be applied for software configuration management, complemented by the following requirements.
6.2.4.2
a. The software configuration management system shall allow any reference version to be regenerated from backups.
EXPECTED OUTPUT: Software configuration management plan
[MGT, SCMP; SRR, PDR].
6.2.4.3
a. The software configuration file and the software release document shall be provided with each software delivery.
EXPECTED OUTPUT: The following outputs are expected:
a. Software configuration file [DDF, SCF; -];
b. Software release document [DDF, SRelD; -].
6.2.4.4
a. The software configuration file shall be available and up to date for each project milestone.
EXPECTED OUTPUT: Software configuration file [DDF, SCF;
CDR, QR, AR, ORR].
6.2.4.5
a. Any components of the code generation tool that are customizable by the user shall be put under configuration control.
b. The change control procedures defined for the project shall address the specific aspects of these components.
EXPECTED OUTPUT: The following outputs are expected:
a. Software configuration file [DDF, SCF; CDR,
QR, AR, ORR];
b. Software configuration management plan [MGT,
SCMP; SRR, PDR].
6.2.4.6
a. The supplier shall ensure that all authorized changes are implemented in accordance with the software configuration management plan.
EXPECTED OUTPUT: Authorized changes - Software configuration
file [DDF, SCF; CDR, QR, AR, ORR].
6.2.4.7
a. The following documents shall be controlled (see ECSS-Q-ST-10 clause 5.2.5):
1. procedural documents describing the quality system to be applied during the software life cycle;
2. planning documents describing the planning and progress of the activities;
3. documents describing a particular software product, including:
(a) development phase inputs,
(b) development phase outputs,
(c) verification and validation plans and results,
(d) test case specifications, test procedures and test reports,
(e) traceability matrices,
(f) documentation for the software and system operators and users, and
(g) maintenance documentation.
6.2.4.8
a. The supplier shall identify a method and tool to protect the supplied software against corruption.
NOTE For example: source, executable and data.
EXPECTED OUTPUT: The following outputs are expected:
a. Software product assurance plan [PAF, SPAP;
SRR, PDR];
b. Software configuration file [DDF, SCF; CDR,
QR, AR, ORR].
6.2.4.9
a. The supplier shall define a checksum-type key calculation for the delivered operational software.
NOTE For example: executable binary, database.
EXPECTED OUTPUT: Software product assurance plan [PAF,
SPAP; SRR, PDR].
6.2.4.10
a. The checksum value shall be provided in the software configuration file with each software delivery.
EXPECTED OUTPUT: Software configuration file [DDF, SCF; -].
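A checksum-type key as required by 6.2.4.9 and 6.2.4.10 can be produced as sketched below. SHA-256 and the entry format are assumptions for illustration; the standard only requires that a checksum calculation be defined and that its value accompany each delivery in the software configuration file:

```python
import hashlib
from pathlib import Path

def checksum_entry(path: str) -> str:
    """Return a checksum line suitable for the software configuration file.

    The "name  sha256:digest" layout is an illustrative convention, not an
    ECSS-prescribed format.
    """
    digest = hashlib.sha256(Path(path).read_bytes()).hexdigest()
    return f"{Path(path).name}  sha256:{digest}"
```

The customer recomputes the same digest on the received binary and compares it against the delivered entry to detect corruption in transit.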
6.2.4.11
a. The media through which the software is delivered to the customer shall be marked by the supplier indicating the following information as a minimum:
1. the software name;
2. the version number;
3. the reference to the software configuration file.
EXPECTED OUTPUT: The following outputs are expected:
a. Software product assurance plan [PAF, SPAP;
SRR, PDR];
b. Labels [DDF, -; -].
6.2.5 Process metrics
6.2.5.1
a. Metrics shall be used to manage the development and to assess the quality of the development processes.
NOTE Process metrics are based on quality models (see clause 5.2.7).
EXPECTED OUTPUT: Software product assurance plan [PAF,
SPAP; SRR, PDR].
6.2.5.2
a. Process metrics shall be collected, stored and analysed on a regular basis by applying quality models and procedures.
EXPECTED OUTPUT: Software product assurance plan [PAF,
SPAP; SRR, PDR].
6.2.5.3
a. The following basic process metrics shall be used within the supplier’s organization:
1. duration: how phases and tasks are being completed versus the planned schedule;
2. effort: how much effort is consumed by the various phases and tasks compared to the plan.
EXPECTED OUTPUT: Internal metrics report.
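The duration and effort metrics above amount to tracking actual figures against planned ones per phase or task. A minimal sketch, with illustrative field names and units (days and hours) that are assumptions, not ECSS definitions:

```python
from dataclasses import dataclass

@dataclass
class TaskMetric:
    """Planned versus actual figures for one phase or task."""
    name: str
    planned_days: float
    actual_days: float
    planned_effort_h: float
    actual_effort_h: float

    def schedule_variance(self) -> float:
        """Duration metric: positive means behind schedule."""
        return self.actual_days - self.planned_days

    def effort_ratio(self) -> float:
        """Effort metric: consumed effort as a fraction of the plan."""
        return self.actual_effort_h / self.planned_effort_h
```

Collected regularly, such records give the internal metrics report its duration and effort trends.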
6.2.5.4
a. Process metrics shall be used within the supplier’s organization and reported to the customer, including:
1. number of problems detected during verification;
2. number of problems detected during integration and validation testing and use.
NOTE See also software problem reporting described in clause 5.2.5.
EXPECTED OUTPUT: Software product assurance reports [PAF, -; -].
6.2.5.5
a. Metrics reports shall be included in the software product assurance reports.
EXPECTED OUTPUT: Software product assurance reports [PAF, -; -].
6.2.6 Verification
6.2.6.1
a. Activities for the verification of the quality requirements shall be specified in the definition of the verification plan.
NOTE Verification includes various techniques such as review, inspection, testing, walkthrough, cross-reading, desk-checking, model simulation, and many types of analysis such as traceability analysis, formal proof or fault tree analysis.
EXPECTED OUTPUT: Software verification plan [DJF, SVerP;
PDR].
6.2.6.2
a. The outputs of each development activity shall be verified for conformance against predefined criteria.
b. Only outputs which have been subjected to planned verifications shall be used as inputs for subsequent activities.
EXPECTED OUTPUT: Software product assurance reports [PAF, -; -].
6.2.6.3
a. A summary of the assurance activities concerning the verification process and their findings shall be included in software product assurance reports.
EXPECTED OUTPUT: Software product assurance reports [PAF, -; -].
6.2.6.4
a. The completion of actions related to software problem reports generated during verification shall be verified and recorded.
EXPECTED OUTPUT: Software problem reports [DJF, -; SRR,
PDR, CDR, QR, AR, ORR].
6.2.6.5
a. Software containing deactivated code shall be verified specifically to ensure that the deactivated code cannot be activated or that its accidental activation cannot harm the operation of the system.
EXPECTED OUTPUT: Software verification report [DJF, SVR;
CDR, QR, AR].
6.2.6.6
a. Software containing configurable code shall be verified specifically to ensure that any unintended configuration cannot be activated at run-time or included during code generation.
EXPECTED OUTPUT: Software verification report [DJF, SVR;
CDR, QR, AR].
6.2.6.7
a. The supplier shall ensure that:
1. the planned verification activities are adequate to confirm that the products of each phase are conformant to the applicable requirements;
2. the verification activities are performed according to the plan.
EXPECTED OUTPUT: Software product assurance reports [PAF, -; -].
6.2.6.8
a. Reviews and inspections shall be carried out according to defined criteria, and according to the defined level of independence of the reviewer from the author of the reviewed item.
6.2.6.9
a. Each review and inspection shall be based on a written plan or procedure.
NOTE For project reviews, ECSS-E-ST-40 clause 5.3.3.3, bullet b and Annex P are applicable.
EXPECTED OUTPUT: Review and inspection plans or procedures
[PAF, -; -].
6.2.6.10
a. The review or inspection plans or procedures shall specify:
1. the reviewed or inspected items;
2. the person in charge;
3. the participants;
4. the means of review or inspection (e.g. tools or checklist);
5. the nature of the report.
EXPECTED OUTPUT: Review and inspection plans or procedures
[PAF, -; -].
6.2.6.11
a. Review and inspection reports shall:
1. refer to the corresponding review/inspection procedure or plan;
2. identify the reviewed item, the author, the reviewer, the review criteria and the findings of the review.
EXPECTED OUTPUT: Review and inspection reports [PAF, -; -].
6.2.6.12
a. Traceability matrices (as defined in ECSS-E-ST-40 clause 5.8) shall be verified at each milestone.
EXPECTED OUTPUT: Software product assurance milestone report
[PAF, SPAMR; SRR, PDR, CDR, QR, AR,
ORR].
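Verifying a traceability matrix at a milestone includes checking that no requirement is left without a covering item. A minimal forward-traceability check over a matrix represented as a mapping; the requirement and test-case identifiers are purely illustrative:

```python
def traceability_gaps(matrix: dict[str, list[str]]) -> list[str]:
    """Return requirement IDs that trace to no design or test item.

    `matrix` maps each requirement ID to the items covering it, e.g.
    {"SRS-001": ["TC-01"], "SRS-002": []} (hypothetical IDs).
    """
    return sorted(req for req, items in matrix.items() if not items)
```

A symmetric check over the inverted matrix verifies backward traceability (no test or design item without a parent requirement).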
6.2.6.13
a. Independent software verification shall be performed by a third party.
b. Independent software verification shall be a combination of reviews, inspections, analyses, simulations, testing and auditing.
NOTE This requirement is applicable where the risks
associated with the project justify the costs
involved. The customer can consider a less
rigorous level of independence, e.g. an
independent team in the same organization.
EXPECTED OUTPUT: The following outputs are expected:
a. ISVV plan [DJF, -; SRR, PDR];
b. ISVV report [DJF, -; PDR, CDR, QR, AR].
6.2.7 Reuse of existing software
6.2.7.1 General
The requirements in 6.2.7 do not apply to tools and the software development environment, for which the requirements of clause 5.6 apply.
6.2.7.2
a. Analyses of the advantages to be obtained with the selection of existing software (ref. 3.2.11) instead of new development shall be carried out.
EXPECTED OUTPUT: The following outputs are expected:
a. Software reuse approach, including approach to
delta qualification [PAF, SPAP; SRR, PDR];
b. Software reuse file [DJF, SRF; SRR, PDR].
6.2.7.3
a. The existing software shall be assessed with regard to the applicable functional, performance and quality requirements.
EXPECTED OUTPUT: The following outputs are expected:
a. Software reuse approach, including approach to
delta qualification [PAF, SPAP; SRR, PDR];
b. Software reuse file [DJF, SRF; SRR, PDR].
6.2.7.4
a. The quality level of the existing software shall be analysed with respect to the project requirements, according to the criticality of the system function implemented, taking into account the following aspects:
1. software requirements documentation;
2. software architectural and detailed design documentation;
3. forward and backward traceability between system requirements, software requirements, design and code;
4. unit tests documentation and coverage;
5. integration tests documentation and coverage;
6. validation documentation and coverage;
7. verification reports;
8. performance;
NOTE For example: memory occupation, CPU load.
9. operational performances;
10. residual nonconformances and waivers;
11. user operational documentation;
NOTE For example: user manual.
12. code quality (adherence to coding standards, metrics).
EXPECTED OUTPUT: The following outputs are expected:
a. Software reuse approach, including approach to
delta qualification [PAF, SPAP; SRR, PDR];
b. Software reuse file [DJF, SRF; SRR, PDR].
6.2.7.5
a. The results of the reused software analysis shall be recorded in the software reuse file, together with an assessment of the possible level of reuse and a description of the assumptions and the methods applied when estimating the level of reuse.
NOTE Results of the reused software analysis include, for example, detailed references to requirement and design documents, test reports and coverage results.
EXPECTED OUTPUT: The following outputs are expected:
a. Software reuse approach, including approach to
delta qualification [PAF, SPAP; SRR, PDR];
b. Software reuse file [DJF, SRF; SRR, PDR].
6.2.7.6
a. The analysis of the suitability of existing software for reuse shall be complemented by an assessment of the following aspects:
1. the acceptance and warranty conditions;
2. the available support documentation;
3. the conditions of installation, preparation, training and use;
4. the identification and registration by configuration management;
5. maintenance responsibility and conditions, including the possibilities of changes;
6. the durability and validity of methods and tools used in the initial development, that are envisaged to be used again;
7. the copyright and intellectual property rights constraints (modification rights);
8. the licensing conditions;
9. exportability constraints.
EXPECTED OUTPUT: Software reuse file [DJF, SRF; SRR, PDR].
6.2.7.7
a. Corrective actions shall be identified, documented in the reuse file and applied to the reused software not meeting the applicable requirements related to the aspects as specified in clauses 6.2.7.2 to 6.2.7.6.
EXPECTED OUTPUT: Software reuse file [DJF, SRF; SRR, PDR].
6.2.7.8
a. Reverse engineering techniques shall be applied to generate missing documentation and to reach the required verification and validation coverage.
b. For software products whose life cycle data from previous development are not available and reverse engineering techniques are not fully applicable, the following methods shall be applied:
1. generation of validation and verification documents based on the available user documentation (e.g. user manual) and execution of tests in order to achieve the required level of test coverage;
2. use of the product service history to provide evidence of the product’s suitability for the current application, including information about:
(a) relevance of the product service history for the new operational environment;
(b) configuration management and change control of the software product;
(c) effectiveness of problem reporting;
(d) actual error rates and maintenance records;
(e) impact of modifications.
EXPECTED OUTPUT: Software reuse file [DJF, SRF; SRR, PDR].
6.2.7.9
a. The software reuse file shall be updated at project milestones to reflect the results of the identified corrective actions for reused software not meeting the project requirements.
EXPECTED OUTPUT: Software reuse file [DJF, SRF; CDR, QR,
AR].
6.2.7.10
a. All the reused software shall be kept under configuration control.
6.2.7.11
a. The detailed configuration status of the reused software baseline shall be provided to the customer in the reuse file for acceptance.
EXPECTED OUTPUT: Software reuse file [DJF, SRF; SRR, PDR,
CDR, QR, AR].
6.2.8 Automatic code generation
6.2.8.1
a. For the selection of tools for automatic code generation, the supplier shall evaluate the following aspects:
1. evolution of the tools in relation to the tools that use the generated code as an input;
NOTE For example: compilers or code management systems.
2. customization of the tools to comply with project standards;
3. portability requirements for the generated code;
4. collection of the required design and code metrics;
5. verification of software components containing generated code;
6. configuration control of the tools including the parameters for customisation;
7. compliance with open standards.
6.2.8.2
a. The requirements on testing applicable to the automatically generated code shall ensure the achievement of the same objectives as those for manually generated code.
EXPECTED OUTPUT: Validation and testing documentation [DJF,
SValP; PDR], [DJF, SVS; CDR, QR, AR],
[DJF, SUITP; PDR, CDR].
6.2.8.3
a. The required level of verification and validation of the automatic generation tool shall be at least the same as the one required for the generated code, if the tool is used to skip verification or testing activities on the target code.
6.2.8.4
a. Modelling standards for automatic code generation tools shall be defined and applied.
EXPECTED OUTPUT: Modelling standards [PAF, -; SRR, PDR].
6.2.8.5
a. Adherence to modelling standards shall be verified.
EXPECTED OUTPUT: Software product assurance reports [PAF, -; -].
6.2.8.6
a. Clause 6.3.4 shall apply to automatically generated code, unless the supplier demonstrates that the automatically generated code does not need to be manually modified.
6.2.8.7
a. The verification and validation documentation shall address separately the activities to be performed for manually and automatically generated code.
EXPECTED OUTPUT: Validation and testing documentation [DJF,
SValP; PDR], [DJF, SVS; CDR, QR, AR],
[DJF, SUITP; PDR, CDR].
6.3 Requirements applicable to individual software
engineering processes or activities
6.3.1 Software related system requirements
process
6.3.1.1
a. For the definition of the software related system requirements to be specified in the requirements baseline, ECSS-E-ST-40 clause 5.2 shall apply.
6.3.1.2
a. The requirements baseline shall be subject to documentation control and configuration management as part of the development documentation.
6.3.1.3
a. For the definition of the requirements baseline, all results from the safety and dependability analyses (including results from the HSIA, ECSS-Q-ST-30 clause 6.4.2.3) shall be used.
6.3.2 Software requirements analysis
6.3.2.1
a. The requirements baseline shall be analyzed to fully and unambiguously define the software requirements in the technical specification.
6.3.2.2
a. The technical specification shall be subject to documentation control and configuration management as part of the development documentation.
6.3.2.3
a. For the definition of the technical specification, all results from the safety and dependability analyses (including results from the HSIA, ECSS-Q-ST-30 clause 6.4.2.3) shall be used.
6.3.2.4
a. In addition to the functional requirements, the technical specification shall include all non-functional requirements necessary to satisfy the requirements baseline, including, as a minimum, the following:
1. performance,
2. safety,
3. reliability,
4. robustness,
5. quality,
6. maintainability,
7. configuration management,
8. security,
9. privacy,
10. metrication, and
11. verification and validation.
NOTE Performance requirements include requirements on numerical accuracy.
EXPECTED OUTPUT: Software requirements specification [TS,
SRS; PDR].
6.3.2.5
a. Prior to the technical specification elaboration, customer and supplier shall agree on the following principles and rules as a minimum:
1. assignment of persons (on both sides) responsible for establishing the technical specification;
2. methods for agreeing on requirements and approving changes;
3. efforts to prevent misunderstandings, such as definition of terms and explanations of the background of requirements;
4. recording and reviewing discussion results on both sides.
6.3.3 Software architectural design and design of
software items
6.3.3.1
a. The design definition file shall be subject to documentation control and configuration management.
6.3.3.2
a. Mandatory and advisory design standards shall be defined and applied.
EXPECTED OUTPUT: Design standards [PAF, -; SRR, PDR].
6.3.3.3
a. For software in which numerical accuracy is relevant to mission success, specific rules on design and code shall be defined to ensure that the specified level of accuracy is obtained.
NOTE For example: for an attitude and orbit control subsystem, scientific data generation components.
EXPECTED OUTPUT: Software product assurance plan [PAF,
SPAP; PDR].
6.3.3.4
a. Adherence to design standards shall be verified.
EXPECTED OUTPUT: Software product assurance reports [PAF, -; -].
6.3.3.5
a. The supplier shall define means, criteria and tools to ensure that the complexity and modularity of the design meet the quality requirements.
b. The design evaluation shall be performed in parallel with the design process, in order to provide feedback to the software design team.
EXPECTED OUTPUT: Software product assurance plan [PAF,
SPAP; PDR].
6.3.3.6
a. A synthesis of the results obtained in the software complexity and modularity evaluation, and of the corrective actions implemented, shall be described in the software product assurance reports.
EXPECTED OUTPUT: Software product assurance reports [PAF, -; -].
6.3.3.7
a. The supplier shall review the design documentation to ensure that it contains the appropriate level of information for maintenance activities.
EXPECTED OUTPUT: The following outputs are expected:
a. Software product assurance plan [PAF, SPAP;
PDR];
b. Software product assurance reports [PAF, -; -].
6.3.4 Coding
6.3.4.1
a. Coding standards (including consistent naming conventions and adequate commentary rules) shall be specified and observed.
EXPECTED OUTPUT: Coding standards [PAF, -; PDR].
6.3.4.2
a. The standards shall be consistent with the product quality requirements.
NOTE Coding standards depend on the software quality objectives (see clause 5.2.7).
EXPECTED OUTPUT: Coding standards [PAF, -; PDR].
6.3.4.3
a. The tools to be used in implementing and checking conformance with coding standards shall be identified in the product assurance plan before coding activities start.
EXPECTED OUTPUT: Software product assurance plan [PAF,
SPAP; PDR].
6.3.4.4
a. Coding standards shall be reviewed with the customer to ensure that they reflect product quality requirements.
EXPECTED OUTPUT: Coding standards and description of tools
[PAF, -; PDR].
6.3.4.5
a. Use of low-level programming languages shall be justified.
EXPECTED OUTPUT: Software development plan [MGT,
SDP; PDR].
6.3.4.6
a. The supplier shall define measurements, criteria and tools to ensure that the software code meets the quality requirements.
EXPECTED OUTPUT: Software product assurance plan [PAF, SPAP; PDR].
b. The code evaluation shall be performed in parallel with the coding process, in order to provide feedback to the software programmers.
6.3.4.7
a. A synthesis of the code analysis results and of the corrective actions implemented shall be described in the software product assurance reports.
EXPECTED OUTPUT: Software product assurance reports [PAF, -; -].
6.3.4.8
a. The code shall be put under configuration control immediately after successful unit testing.
6.3.5 Testing and validation
6.3.5.1
a. Testing shall be performed in accordance with a strategy for each testing level (i.e. unit, integration, validation against the technical specification, validation against the requirements baseline, acceptance), which includes:
1. the types of tests to be performed;
NOTE For example: functional, boundary, performance, and usability tests.
2. the tests to be performed in accordance with the plans and procedures;
3. the means and organizations to perform the assurance function for testing and validation.
EXPECTED OUTPUT: Software product assurance plan [PAF,
SPAP; PDR, CDR].
6.3.5.2
a. Based on the criticality of the software, test coverage goals for each testing level shall be agreed between the customer and the supplier and their achievement monitored by metrics:
1. for unit level testing;
2. for integration level testing;
3. for validation against the technical specification and validation against the requirements baseline.
EXPECTED OUTPUT: Software product assurance plan [PAF,
SPAP; PDR, CDR].
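Monitoring coverage goals by metrics reduces to comparing measured coverage against the agreed goal per testing level. A minimal sketch; the level names and percentage figures are illustrative, and the agreed goals would in practice come from the software product assurance plan:

```python
def coverage_shortfalls(measured: dict[str, float],
                        goals: dict[str, float]) -> dict[str, float]:
    """Percentage points still missing per testing level.

    Levels absent from `measured` count as 0 % covered.
    """
    return {level: goal - measured.get(level, 0.0)
            for level, goal in goals.items()
            if measured.get(level, 0.0) < goal}
```

An empty result means every agreed goal is met; any remaining entries feed the feedback loop to the developers required by 6.3.5.5.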
6.3.5.3
a. The supplier shall ensure through internal review that the test procedures and data are adequate, feasible and traceable, and that they satisfy the requirements.
EXPECTED OUTPUT: Software product assurance reports [PAF, -; -].
6.3.5.4
a. Test readiness reviews shall be held before the commencement of test activities, as defined in the software development plan.
EXPECTED OUTPUT: Test readiness review reports [DJF, -; TRR].
6.3.5.5
a. Test coverage shall be checked with respect to the stated goals.
EXPECTED OUTPUT: Software product assurance reports [PAF, -; -].
b. Feedback from the results of the test coverage evaluation shall be continuously provided to the software developers.
6.3.5.6
a. The supplier shall ensure that nonconformances and software problem reports detected during testing are properly documented and reported to those concerned.
EXPECTED OUTPUT: Nonconformance reports and software
problem reports [DJF, -; CDR, QR, AR,
ORR].
6.3.5.7
a. The test coverage of configurable code shall be checked to ensure that the stated requirements are met in each tested configuration.
EXPECTED OUTPUT: Statement of compliance with test plans and
procedures [PAF, -; CDR, QR, AR, ORR].
6.3.5.8
a. The completion of actions related to software problem reports generated during testing and validation shall be verified and recorded.
EXPECTED OUTPUT: Software problem reports [DJF, -; SRR,
PDR, CDR, QR, AR, ORR].
6.3.5.9
a. Provisions shall be made to allow witnessing of tests by the customer.
6.3.5.10
a. Provisions shall be made to allow witnessing of tests by supplier personnel independent of the development.
NOTE For example: specialist software product assurance personnel.
6.3.5.11
a. The supplier shall ensure that:
1. tests are conducted in accordance with approved test procedures and data,
2. the configuration under test is correct,
3. the tests are properly documented, and
4. the test reports are up to date and valid.
EXPECTED OUTPUT: Statement of compliance with test plans and
procedures [PAF, -; CDR, QR, AR, ORR].
6.3.5.12
a. The supplier shall ensure that tests are repeatable by verifying the storage and recording of the tested software, support software, test environment, supporting documents and problems found.
EXPECTED OUTPUT: Software product assurance reports [PAF, -; -].
6.3.5.13
a. The supplier shall confirm in writing that the tests are successfully completed.
EXPECTED OUTPUT: Testing and validation reports [DJF, -; CDR,
QR, AR, ORR].
6.3.5.14
a. Review boards looking at engineering and product assurance aspects shall be convened after the completion of test phases, as defined in the software development plan.
6.3.5.15
a. Areas affected by any modification shall be identified and retested (regression testing).
6.3.5.16
a. In case of retesting, all test-related documentation (test procedures, data and reports) shall be updated accordingly.
EXPECTED OUTPUT: Updated test documentation [DJF, -; CDR,
QR, AR, ORR].
6.3.5.17
a. The need for regression testing and additional verification of the software shall be analysed after any change of the platform hardware.
EXPECTED OUTPUT: Updated test documentation [DJF, -; CDR,
QR, AR, ORR].
6.3.5.18
a. The need for regression testing and additional verification of the software shall be analysed after a change or update of any tool used to generate it.
NOTE For example: source code or object code.
EXPECTED OUTPUT: Updated test documentation [DJF, -; CDR,
QR, AR, ORR].
6.3.5.19
a. Validation shall be carried out by staff who have not taken part in the design or coding of the software being validated.
NOTE This can be achieved at the level of the whole software product, or on a component-by-component basis.
6.3.5.20
a. Validation of the flight software against the requirements baseline on the flight equipment model shall be performed on a software version without any patch.
6.3.5.21
a. The supplier shall review the test documentation to ensure that it is up to date and organized to facilitate its reuse for maintenance.
6.3.5.22
a. Tests shall be organized as activities in their own right in terms of planning, resources and team composition.
EXPECTED OUTPUT: Test and validation documentation [DJF,
SValP; PDR], [DJF, SUITP; PDR, CDR].
6.3.5.23
a. The necessary resources for testing shall be identified early in the life cycle, taking into account the operating and maintenance requirements.
EXPECTED OUTPUT: Test and validation documentation [DJF,
SValP; PDR], [DJF, SUITP; PDR, CDR].
6.3.5.24
a. Test tool development or acquisition (hardware and software) shall be planned for in the overall project plan.
EXPECTED OUTPUT: Test and validation documentation [DJF,
SValP; PDR], [DJF, SUITP; PDR, CDR].
6.3.5.25
a. The supplier shall establish and review the test procedures and data before starting testing activities, and also document the constraints of the tests concerning physical, performance, functional, controllability and observability limitations.
EXPECTED OUTPUT: Test and validation documentation [DJF,
SValP; PDR], [DJF, SVS; CDR, QR, AR],
[DJF, SUITP; PDR, CDR].
6.3.5.26
a. Before offering the product for delivery and customer acceptance, the supplier shall validate its operation as a complete product, under conditions similar to the application environment as specified in the requirements baseline.
6.3.5.27
a. When testing under the operational environment is performed, the following concerns shall be addressed:
1. the features to be tested in the operational environment;
2. the specific responsibilities of the supplier and customer for carrying out and evaluating the test;
3. restoration of the previous operational environment (after test).
EXPECTED OUTPUT: Test and validation documentation [DJF, -;
AR].
6.3.5.28
a. Independent software validation shall be performed by a third party.
NOTE This requirement is applicable where the risks
associated with the project justify the costs
involved. The customer can consider a less
rigorous level of independence, e.g. an
independent team in the same organization.
EXPECTED OUTPUT: The following outputs are expected:
a. ISVV plan [DJF, -; SRR, PDR];
b. ISVV report [DJF, -; PDR, CDR, QR, AR].
6.3.5.29
a. The validation shall include testing in the different possible configurations, or in a representative set of them when it is evident that the number of possible configurations is too high to allow validation in all of them.
EXPECTED OUTPUT: Test and validation documentation [DJF,
SValP; PDR], [DJF, SVS; CDR, QR, AR].
6.3.5.30
a. Software containing deactivated code shall be validated specifically to ensure that the deactivated code cannot be activated or that its accidental activation cannot harm the operation of the system.
EXPECTED OUTPUT: Testing and validation reports [DJF, -; CDR,
QR, AR].
6.3.5.31
a. Software containing configurable code shall be validated specifically to ensure that an unintended configuration cannot be activated at run-time or included during code generation.
EXPECTED OUTPUT: Testing and validation reports [DJF, -; CDR,
QR, AR].
6.3.5.32
a. Activities for the validation of the quality requirements shall be specified in the definition of the validation specification.
EXPECTED OUTPUT: Software validation specification [DJF, SVS;
CDR, QR, AR].
6.3.6 Software delivery and acceptance
6.3.6.1
a. The roles, responsibilities and obligations of the supplier and customer during installation shall be established.
EXPECTED OUTPUT: Installation procedure [DDF, SCF; AR].
6.3.6.2
a. The installation shall be performed in accordance with the installation procedure.
6.3.6.3
a. The customer shall establish an acceptance test plan specifying the intended acceptance tests, including specific tests suited to the target environment (see ECSS-E-ST-40 clause 5.7.3.1).
NOTE 1 The acceptance tests can be partly made up of tests used during previous test activities.
NOTE 2 The acceptance test plan takes into account the requirement for operational demonstration, either as part of acceptance or after acceptance.
EXPECTED OUTPUT: Acceptance test plan [DJF, -; QR, AR].
6.3.6.4
a. The customer shall ensure that the acceptance tests are performed in accordance with the approved acceptance test plan (see ECSS-E-ST-40 clause 5.7.3.2).
6.3.6.5
a. Before the software is presented for customer acceptance, the supplier shall ensure that:
1. the delivered software complies with the contractual requirements (including any specified content of the software acceptance data package);
2. the source and object code supplied correspond to each other;
3. all agreed changes are implemented;
4. all nonconformances are either resolved or declared.
6.3.6.6
a. The customer shall verify that the executable code was regenerated from configuration-managed source code components and installed in accordance with predefined procedures on the target environment.
6.3.6.7
a. Any discovered problems shall be documented in nonconformance reports.
EXPECTED OUTPUT: Nonconformance reports [DJF, -; AR].
6.3.6.8
a. On completion of the acceptance tests, a report shall be drawn up and signed by the supplier’s representatives, the customer’s representatives, the software quality engineers of both parties, and the representative of the organization charged with the maintenance of the software product.
EXPECTED OUTPUT: Acceptance test report [DJF, -; AR].
6.3.6.9
a. The customer shall certify conformance to the procedures and state the conclusion concerning the test result for the software product under test (accepted, conditionally accepted, rejected).
EXPECTED OUTPUT: Acceptance test report [DJF, -; AR].
6.3.7 Operations
6.3.7.1
a. During operations, the quality of the mission products related to software shall be agreed with the customer and users.
NOTE Quality of mission products can include parameters such as: error-free data, availability of data and permissible outages, and permissible information degradation.
EXPECTED OUTPUT: Software operation support plan [OP, -; ORR].
6.3.7.2
a. During the demonstration that the software conforms to the operational requirements, the following shall be covered as a minimum:
1. availability and maintainability of the host system (including reboot after maintenance interventions);
2. safety features;
3. human-computer interface;
4. operating procedures;
5. ability to meet the mission product quality requirements.
EXPECTED OUTPUT: Validation of the operational requirements
[PAF, -; ORR].
6.3.7.3
a. The product assurance plan for system operations shall include consideration of software.
EXPECTED OUTPUT: Input to product assurance plan for systems operation [PAF, -; ORR].
6.3.8 Maintenance
6.3.8.1
a. The organization responsible for maintenance shall be identified to allow a smooth transition into operations and maintenance.
NOTE An organization, with representatives from both supplier and customer, can be set up to support the maintenance activities. Attention is drawn to the importance of the flexibility of this organization to cope with the unexpected occurrence of problems and the identification of facilities and resources to be used for the maintenance activities.
EXPECTED OUTPUT: Maintenance plan [MF, -; QR, AR, ORR].
6.3.8.2
a. The maintenance organization shall specify the assurance, verification and validation activities applicable to maintenance interventions.
EXPECTED OUTPUT: Maintenance plan [MF, -; QR, AR, ORR].
6.3.8.3
a. The maintenance plans shall be verified against specified requirements for maintenance of the software product.
NOTE The maintenance plans and procedures can address corrective, improving, adaptive and preventive maintenance, differentiating between "routine" and "emergency" maintenance activities.
6.3.8.4
a. The maintenance plans and procedures shall include the following as a minimum:
1. scope of maintenance;
2. identification of the first version of the software product for which maintenance is to be done;
3. support organization;
4. maintenance life cycle;
5. maintenance activities;
6. quality measures to be applied during the maintenance;
7. maintenance records and reports.
EXPECTED OUTPUT: Maintenance plan [MF, -; QR, AR, ORR].
6.3.8.5
a. Rules for the submission of maintenance reports shall be established and agreed as part of the maintenance plan.
EXPECTED OUTPUT: Maintenance plan [MF, -; QR, AR, ORR].
6.3.8.6
a. All maintenance activities shall be logged in predefined formats and
retained.
EXPECTED OUTPUT: Maintenance records [MF, -; -].
6.3.8.7
a. Maintenance records shall be established for each software product, including, as a minimum, the following information:
1. list of requests for assistance or problem reports that have been received and the current status of each;
2. organization responsible for responding to requests for assistance or implementing the appropriate corrective actions;
3. priorities assigned to the corrective actions;
4. results of the corrective actions;
5. statistical data on failure occurrences and maintenance activities.
NOTE The record of the maintenance activities can be utilized for evaluation and enhancement of the software product and for improvement of the quality system itself.
EXPECTED OUTPUT: Maintenance records [MF, -; -].
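As an illustration only, the minimum record content above can be captured in a simple structure; the class and field names below are invented for this sketch, not prescribed by this Standard.

```python
from dataclasses import dataclass, field

@dataclass
class MaintenanceRecord:
    """Per-product maintenance record covering the 6.3.8.7 minimum content
    (illustrative structure; names are not prescribed by the Standard)."""
    product_id: str
    requests: list = field(default_factory=list)    # requests / problem reports with status
    responsible_org: str = ""                       # organization handling corrective actions
    priorities: dict = field(default_factory=dict)  # corrective action -> assigned priority
    results: dict = field(default_factory=dict)     # corrective action -> result
    failure_stats: dict = field(default_factory=dict)  # e.g. failures per month

    def log_request(self, report_id: str, status: str) -> None:
        """Append a request or problem report with its current status."""
        self.requests.append({"id": report_id, "status": status})

rec = MaintenanceRecord(product_id="OBSW-1.2")  # hypothetical product identifier
rec.log_request("SPR-042", "open")
```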
7
Software product quality assurance
7.1 Product quality objectives and metrication
7.1.1 Deriving of requirements
a. The software quality requirements (including safety and dependability
requirements) shall be derived from the requirements defined at system
level.
EXPECTED OUTPUT: The following outputs are expected:
a. Requirement baseline [RB, SSS; SRR];
b. Technical specification [TS, SRS; PDR].
7.1.2 Quantitative definition of quality
requirements
a. Quality requirements shall be expressed in quantitative terms or
constraints.
EXPECTED OUTPUT: The following outputs are expected:
a. Requirement baseline [RB, SSS; SRR];
b. Technical specification [TS, SRS; PDR].
7.1.3 Assurance activities for product quality
requirements
a. The supplier shall define assurance activities to ensure that the product meets the quality requirements as specified in the technical specification.
EXPECTED OUTPUT: Software product assurance plan [PAF,
SPAP; SRR, PDR].
7.1.4 Product metrics
a. In order to verify the implementation of the product quality requirements, the supplier shall define a metrication programme based on the identified quality model (see clause 5.2.7), specifying:
1. the metrics to be collected and stored;
2. the means to collect metrics (measurements);
3. the target values, with reference to the product quality requirements;
4. the analyses to be performed on the collected metrics, including the ones to derive:
(a) descriptive statistics;
NOTE For example: the number of units at each level of complexity.
(b) trend analysis (such as trends in software problems).
5. how the results of the analyses performed on the collected metrics are fed back to the development team and used to identify corrective actions;
6. the schedule of metrics collection, storing, analysis and reporting, with reference to the whole software life cycle.
NOTE Guidance for software metrication programme implementation can be found in ECSS-Q-HB-80-04.
EXPECTED OUTPUT: Software product assurance plan [PAF,
SPAP; SRR, PDR].
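To illustrate the analyses in item 4, the sketch below (the weekly problem counts are invented sample data) derives descriptive statistics and a least-squares trend from software-problem counts.

```python
from statistics import mean, median

def linear_trend(values):
    """Least-squares slope of the values against their index
    (here: change in problem count per week)."""
    n = len(values)
    xs = range(n)
    x_bar, y_bar = mean(xs), mean(values)
    num = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, values))
    den = sum((x - x_bar) ** 2 for x in xs)
    return num / den

weekly_problems = [12, 9, 10, 7, 6, 4]  # invented sample data
print("mean:", mean(weekly_problems))
print("median:", median(weekly_problems))
print("trend (slope):", linear_trend(weekly_problems))  # negative slope: problem rate falling
```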
7.1.5 Basic metrics
a. The following basic product metrics shall be used:
1. size (code);
2. complexity (design, code);
3. fault density and failure intensity;
4. test coverage;
5. number of failures.
EXPECTED OUTPUT: Software product assurance plan [PAF,
SPAP; SRR, PDR].
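Two of these basic metrics reduce to simple ratios. The sketch below (input figures are invented for illustration) computes fault density per KLOC and statement test coverage.

```python
def fault_density(faults_found: int, sloc: int) -> float:
    """Faults per thousand source lines of code (KLOC)."""
    return faults_found / (sloc / 1000.0)

def statement_coverage(executed: int, total: int) -> float:
    """Fraction of executable statements exercised by the test campaign."""
    return executed / total

# Illustrative figures for one software item (invented):
density = fault_density(faults_found=18, sloc=24_000)       # 18/24 = 0.75 faults/KLOC
coverage = statement_coverage(executed=4_650, total=5_000)  # 0.93
```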
7.1.6 Reporting of metrics
a. The results of metrics collection and analysis shall be included in the software product assurance reports, in order to provide the customer with an insight into the level of quality obtained.
EXPECTED OUTPUT: Software product assurance reports [PAF, -; -].
7.1.7 Numerical accuracy
a. Numerical accuracy shall be estimated and verified.
EXPECTED OUTPUT: Numerical accuracy analysis [DJF, SVR;
PDR, CDR, QR].
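One way to estimate numerical accuracy empirically, offered here only as a sketch, is to compare a computation against a higher-precision reference. The example below measures the rounding error of a naive floating-point summation against Python's correctly rounded `math.fsum`.

```python
import math

def naive_sum(values):
    """Sum left to right in double precision, accumulating rounding error."""
    total = 0.0
    for v in values:
        total += v
    return total

values = [0.1] * 1_000_000        # illustrative data set
reference = math.fsum(values)     # correctly rounded reference sum
estimate = naive_sum(values)
abs_error = abs(estimate - reference)
rel_error = abs_error / abs(reference)  # empirical accuracy estimate
```

The same comparison technique applies to flight algorithms run at reduced precision against a double-precision reference model.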
7.1.8 Analysis of software maturity
a. The supplier shall define the organization and means implemented to collect and analyse data required for the study of software maturity.
NOTE For example: failures, corrections, duration of
runs.
EXPECTED OUTPUT: Software product assurance reports [PAF, -; -].
7.2 Product quality requirements
7.2.1 Requirements baseline and technical
specification
7.2.1.1
a. The software quality requirements shall be documented in the requirements baseline and technical specification.
EXPECTED OUTPUT: The following outputs are expected:
a. Requirement baseline [RB, SSS; SRR];
b. Technical specification [TS, SRS; PDR].
7.2.1.2
a. The software requirements shall be:
1. correct;
2. unambiguous;
3. complete;
4. consistent;
5. verifiable;
6. traceable.
7.2.1.3
a. For each requirement the method for verification and validation shall be specified.
EXPECTED OUTPUT: The following outputs are expected:
a. Requirement baseline [RB, SSS; SRR];
b. Technical specification [TS, SRS; PDR].
7.2.2 Design and related documentation
7.2.2.1
a. The software design shall meet the non-functional requirements as documented in the technical specification.
7.2.2.2
a. The software shall be designed to facilitate testing.
7.2.2.3
a. Software with a long planned lifetime shall be designed with minimum
dependency on the operating system and the hardware, in order to aid
portability.
NOTE This requirement is applicable to situations where the software lifetime can lead to the obsolescence and non-availability of the original operating system and/or hardware, thereby jeopardizing the maintainability of the software.
EXPECTED OUTPUT: The following outputs are expected:
a. Software product assurance plan [PAF, SPAP;
SRR, PDR];
b. Justification of design choices [DDF, SDD;
PDR, CDR].
7.2.3 Test and validation documentation
7.2.3.1
a. Detailed test and validation documentation (data, procedures and expected results) defined in the ECSS-E-ST-40 DJF shall be consistent with the defined test and validation strategy (see clause 6.3.5 and ECSS-E-ST-40 clauses 5.5.3, 5.5.4, 5.6 and 5.8).
7.2.3.2
a. The test documentation shall cover the test environment, tools and test software, personnel required and associated training requirements.
7.2.3.3
a. The criteria for completion of each test and any contingency steps shall be specified.
7.2.3.4
a. Test procedures, data and expected results shall be specified.
7.2.3.5
a. The hardware and software configuration shall be identified and documented as part of the test documentation.
7.2.3.6
a. For any requirements not covered by testing, a verification report shall be drawn up documenting or referring to the verification activities performed.
EXPECTED OUTPUT: Software verification report [DJF, SVR;
CDR, QR, AR].
7.3 Software intended for reuse
7.3.1 Customer requirements
a. For the development of software intended for reuse, ECSS-E-ST-40 clauses 5.2.4.7 and 5.4.3.6 shall apply.
7.3.2 Separate documentation
a. The information related to the components developed for reuse shall be separated from the others in the technical specification, design justification file, design definition file and product assurance file.
7.3.3 Self-contained information
a. The information related to components developed for reuse in the technical specification, the design justification file, the design definition file and the product assurance file shall be self-contained.
7.3.4 Requirements for intended reuse
a. The technical specification of components developed for reuse shall include requirements for maintainability, portability and verification of those components.
EXPECTED OUTPUT: Technical specification for reusable
components [TS, -; PDR].
7.3.5 Configuration management for intended
reuse
a. The configuration management system shall include provisions for handling specific aspects of software developed for reuse, such as:
- longer lifetime of the components developed for reuse compared to the other components of the project;
- evolution or change of the development environment for the next project that intends to use the components;
- transfer of the configuration and documentation management information to the next project reusing the software.
EXPECTED OUTPUT: Software configuration management plan
[MGT, SCMP; SRR, PDR].
7.3.6 Testing on different platforms
a. Where the components developed for reuse are developed to be reusable on different platforms, the testing of the software shall be performed on all those platforms.
EXPECTED OUTPUT: Verification and validation documentation
for reusable components [DJF, -; CDR].
7.3.7 Certificate of conformance
a. The supplier shall provide a certificate of conformance that the tests have been successfully completed on all the relevant platforms.
NOTE In case not all platforms are available, the certificate of conformance states the limitations of the validation performed.
EXPECTED OUTPUT: Verification and validation documentation
for reusable components [DJF, -; CDR].
7.4 Standard ground hardware and services for
operational system
7.4.1 Hardware procurement
a. The subcontracting and procurement of hardware shall be carried out according to the requirements of ECSS-Q-ST-20 clause 7.
EXPECTED OUTPUT: The following outputs are expected:
a. Justification of selection of operational ground
equipment [DJF, -; SRR, PDR];
b. Receiving inspection reports [PAF, -; SRR,
PDR].
7.4.2 Service procurement
a. The procurement of support services to be used in operational phases shall be justified as covering service level agreements, quality of services and escalation procedures, as needed for system exploitation and maintenance.
EXPECTED OUTPUT: Justification of selection of operational
support services [DJF, -; SRR, PDR].
7.4.3 Constraints
a. The choice of procured hardware and services shall address the constraints associated with both the development and the actual use of the software.
EXPECTED OUTPUT: Justification of selection of operational
ground equipment [DJF, -; SRR, PDR].
7.4.4 Selection
a. The ground computer equipment and supporting services for implementing the final system shall be selected according to the project requirements regarding:
1. performance;
2. maintenance;
3. durability and technical consistency with the operational
equipment;
4. the assessment of the product with respect to requirements, including the criticality category;
5. the available support documentation;
6. the acceptance and warranty conditions;
7. the conditions of installation, preparation, training and use;
8. the maintenance conditions, including the possibilities of evolutions;
9. copyright constraints;
10. availability;
11. compatibility;
12. site operational constraints.
EXPECTED OUTPUT: Justification of selection of operational
ground equipment [DJF, -; SRR, PDR].
7.4.5 Maintenance
a. Taking account of the provider's maintenance and product policy, it shall be ensured that the hardware and support services can be maintained throughout the specified life of the software product within the operational constraints.
7.5 Firmware
7.5.1 Device programming
a. The supplier shall establish procedures for firmware device programming and duplication of firmware devices.
EXPECTED OUTPUT: Software product assurance plan [PAF,
SPAP; PDR].
7.5.2 Marking
a. The firmware device shall be indelibly marked to allow the identification (by reference) of the hardware component and of the software component.
EXPECTED OUTPUT: Software product assurance plan [PAF,
SPAP; PDR].
7.5.3 Calibration
a. The supplier shall ensure that the firmware programming equipment is
calibrated.
Annex A (informative)
Software documentation

This annex defines the structure of the software documents to be produced, as depicted in Figure A-1.

Figure A-1: Overview of software documents
[Figure A-1 groups the software documents by file. RB (Requirements Baseline): software system specification, software interface requirements document. TS (Technical Specification): software requirements specification, interface control document. DDF (Design Definition File): software design document, software configuration file (DRD in ECSS-M-ST-40), software release document, software user manual. DJF (Design Justification File): software validation plan, software verification plan, software unit and integration plan, software validation specification, software verification report, software reuse file. MGT (Management File): software configuration management plan (DRD in ECSS-M-ST-40), software review plan, software development plan. MF (Maintenance File): maintenance plan (without DRD), migration plan (without DRD). OP (Operational): operational plan, operational testing specification. PAF (Product Assurance File): software product assurance plan, software product assurance milestone report, software product assurance requirements for suppliers.]

Table A-1 represents the document requirements list, identifying the software documentation to be produced in accordance with the requirements defined in this Standard and in ECSS-E-ST-40.
Table A-1: ECSS-E-ST-40 and ECSS-Q-ST-80 Document requirements list (DRL)

Related file | DRL item (e.g. plan, document, file, report, form, matrix) | DRD | Milestones (among SRR, PDR, CDR, QR, AR, ORR)

RB | Software system specification (SSS) | ECSS-E-ST-40 Annex B | ✓
RB | Interface requirements document (IRD) | ECSS-E-ST-40 Annex C | ✓
RB | Safety and dependability analysis results for lower level suppliers | - | ✓
TS | Software requirements specification (SRS) | ECSS-E-ST-40 Annex D | ✓
TS | Software interface control document (ICD) | ECSS-E-ST-40 Annex E | ✓ ✓
DDF | Software design document (SDD) | ECSS-E-ST-40 Annex F | ✓ ✓
DDF | Software configuration file (SCF) | ECSS-M-ST-40 Annex E | ✓ ✓ ✓ ✓ ✓
DDF | Software release document (SRelD) | ECSS-E-ST-40 Annex G | ✓ ✓
DDF | Software user manual (SUM) | ECSS-E-ST-40 Annex H | ✓ ✓ ✓
DDF | Software source code and media labels | - | ✓ ✓
DDF | Software product and media labels | - | ✓ ✓ ✓
DDF | Training material | - | ✓
DJF | Software verification plan (SVerP) | ECSS-E-ST-40 Annex I | ✓
DJF | Software validation plan (SValP) | ECSS-E-ST-40 Annex J | ✓
DJF | Independent software verification & validation plan | - | ✓ ✓
DJF | Software integration test plan (SUITP) | ECSS-E-ST-40 Annex K | ✓ ✓
DJF | Software unit test plan (SUITP) | ECSS-E-ST-40 Annex K | ✓ ✓
DJF | Software validation specification (SVS) with respect to TS | ECSS-E-ST-40 Annex L | ✓ ✓
DJF | Software validation specification (SVS) with respect to RB | ECSS-E-ST-40 Annex L | ✓ ✓
DJF | Acceptance test plan | - | ✓ ✓
DJF | Software unit test report | - | ✓ ✓
DJF | Software integration test report | - | ✓ ✓
DJF | Software validation report with respect to TS | - | ✓ ✓
DJF | Software validation report with respect to RB | - | ✓ ✓
DJF | Acceptance test report | - | ✓
DJF | Installation report | - | ✓
DJF | Software verification report (SVR) | ECSS-E-ST-40 Annex M | ✓ ✓ ✓ ✓ ✓ ✓
DJF | Independent software verification & validation report | - | ✓ ✓ ✓ ✓ ✓
DJF | Software reuse file (SRF) | ECSS-E-ST-40 Annex N | ✓ ✓ ✓ ✓
DJF | Software problem reports and nonconformance reports | - | ✓ ✓ ✓ ✓ ✓ ✓
DJF | Joint review report | - | ✓ ✓ ✓ ✓ ✓
DJF | Justification of selection of operational ground equipment and services | - | ✓ ✓ ✓
MGT | Software development plan (SDP) | ECSS-E-ST-40 Annex O | ✓ ✓ ✓
MGT | Software review plan (SRevP) | ECSS-E-ST-40 Annex P | ✓ ✓ ✓
MGT | Software configuration management plan | ECSS-M-ST-40 Annex A | ✓ ✓ ✓
MGT | Training plan | - | ✓
MGT | Interface management procedures | - | ✓
MGT | Identification of NRB SW members | - | ✓
MGT | Procurement data | - | ✓ ✓ ✓
MF | Maintenance plan | - | ✓ ✓ ✓
MF | Maintenance records | - | ✓ ✓ ✓
MF | SPR and NCR - Modification analysis report - Problem analysis report - Modification identification | - | ✓
MF | Migration plan and notification | - | ✓
MF | Retirement plan and notification | - | ✓
OP | Software operation support plan | - | ✓
OP | Operational testing results | - | ✓
OP | SPR and NCR - User's request record software product - Post operation review report | - | ✓
PAF | Software product assurance plan (SPAP) | ECSS-Q-ST-80 Annex B | ✓ ✓ ✓ ✓ ✓ ✓
PAF | Software product assurance requirements for suppliers | - | ✓ ✓
PAF | Audit plan and schedule | - | ✓ ✓
PAF | Review and inspection plans or procedures | - | ✓
PAF | Procedures and standards | - | ✓ ✓
PAF | Modelling and design standards | - | ✓ ✓ ✓
PAF | Coding standards and description of tools | - | ✓ ✓
PAF | Software problem reporting procedures | - | ✓ ✓
PAF | Software dependability and safety analysis report - Criticality classification of software components | - | ✓ ✓ ✓ ✓
PAF | Software product assurance reports | - | ✓
PAF | Software product assurance milestone report (SPAMR) | ECSS-Q-ST-80 Annex C | ✓ ✓ ✓ ✓ ✓ ✓
PAF | Statement of compliance with test plans and procedures | - | ✓ ✓ ✓ ✓
PAF | Records of training and experience | - | ✓
PAF | (Preliminary) alert information | - |
PAF | Result of pre-award audits and assessments, and of procurement sources | - |
PAF | Software process assessment plan | - |
PAF | Software process assessment records | - |
PAF | Review and inspection reports | - | ✓
PAF | Receiving inspection reports | - | ✓ ✓ ✓ ✓ ✓
PAF | Input to product assurance plan for systems operation | - | ✓
Annex B (normative)
Software product assurance plan (SPAP) -
DRD
B.1 DRD identification
B.1.1 Requirement identification and source document
The software product assurance plan (SPAP) is called from the normative provisions summarized in Table B-1.
Table B-1: SPAP traceability to ECSS-E-ST-40 and ECSS-Q-ST-80 clauses
ECSS Standard: ECSS-Q-ST-80
Clause DRD section
5.1.2.1 <5>.a, <5>.b
5.1.2.2 <5>.a, <5>.b
5.1.2.3 <5>.a
5.1.3.1 <5>.c
5.1.4.1 <5>.b
5.2.1.1 All
5.2.1.3 All
5.2.1.4 <6>.g.7
5.2.1.5 <8>
5.2.6.2 <6>.d
5.2.7.2 <5>.e
5.4.3.3 All
5.4.3.4 All
5.6.1.1 <5>.h
6.1.1 <6>.a
6.1.5 <6>.a.3
6.2.1.4 <6>.b
6.2.3.1 <6>.c
6.2.3.2 <6>.c
6.2.3.4 <6>.c
6.2.3.5 <6>.c
6.2.4.8 <6>.d
6.2.4.9 <6>.d
6.2.4.11 <6>.d
6.2.5.1 <6>.e
6.2.5.2 <6>.e
6.2.7.2 <6>.f
6.2.7.3 <6>.f
6.2.7.4 <6>.f
6.2.7.5 <6>.f
6.3.3.3 <6>.h.3
6.3.3.5 <6>.g.2
6.3.3.7 <6>.g.2
6.3.4.3 <6>.h.2
6.3.4.6 <6>.g.3
6.3.5.1 <6>.g.4
6.3.5.2 <6>.g.4
7.1.3 <7>.b.4
7.1.5 <7>.b
7.1.6 <7>.b
7.2.2.3 <7>.a
7.5.1 <6>.h.3
7.5.2 <6>.h.3
B.1.2 Purpose and objective
The software product assurance plan is a constituent of the product assurance file (PAF).
The purpose of the software product assurance plan is to provide information on the organizational aspects and the technical approach to the execution of the software product assurance programme.
B.2 Expected response
B.2.1 Scope and content
<1> Introduction
a. The SPAP shall contain a description of the purpose, objective, content and the reason prompting its preparation.
<2> Applicable and reference documents
a. The SPAP shall list the applicable and reference documents to support the generation of the document.
<3> Terms, definitions and abbreviated terms
a. The SPAP shall include any additional terms, definitions or abbreviated terms used.
<4> System overview
a. The SPAP shall include or refer to a description of the system and software products being developed.
<5> Software product assurance programme implementation
<5.1> Organization
a. The SPAP shall describe the organization of software product assurance activities, including responsibility, authority and the interrelation of personnel who manage, perform and verify work affecting software quality.
b. The following topics shall be included:
1. organizational structure;
2. interfaces of each organization, either external or internal, involved in the project;
3. relationship to the system level product assurance and safety;
4. independence of the software product assurance function;
5. delegation of software product assurance tasks to a lower level supplier, if any.
<5.2> Responsibilities
a. The SPAP shall describe the responsibilities of the software product assurance function.
<5.3> Resources
a. The SPAP shall describe the resources to be used to perform the software product assurance function.
b. The description in B.2.1<5.3>a. shall include human resources and skills, hardware and software tools.
<5.4> Reporting
a. The SPAP shall describe the reporting to be performed by software product assurance.
<5.5> Quality models
a. The SPAP shall describe the quality models applicable to the project and how they are used to specify the quality requirements.
<5.6> Risk management
a. The SPAP shall describe the contribution of the software product assurance function to the project risk management.
<5.7> Supplier selection and control
a. The SPAP shall describe the contribution of the software product assurance function to the selection and control of next level suppliers.
<5.8> Methods and tools
a. The SPAP shall describe the methods and tools used for all the activities of the development cycle, and their level of maturity.
<5.9> Process assessment and improvement
a. The SPAP shall state the scope and objectives of process assessment.
b. The SPAP shall describe the methods and tools to be used for process assessment and improvement.
<5.10> Operations and maintenance (optional)
a. The SPAP shall specify the quality measures related to the operations and maintenance processes (alternatively, a separate SPAP is produced).
<6> Software process assurance
<6.1> Software development cycle
a. The SPAP shall refer to the software development cycle description in the software development plan.
b. If not covered in the software development plan, the life cycle shall be described.
c. The life cycle shall include a milestone immediately before the starting of the software validation.
<6.2> Project plans
a. The SPAP shall describe all plans to be produced and used in the project.
b. The relationship between the project plans and a timely planning for their preparation and update shall be described.
<6.3> Software dependability and safety
a. The SPAP shall contain a description and justification of the measures to be applied for the handling of critical software, including the analyses to be performed and the standards applicable for critical software.
<6.4> Software documentation and configuration management
a. The SPAP shall describe the contribution of the software product assurance function to the proper implementation of documentation and configuration management.
b. The nonconformance control system shall be described or referenced. The point in the software life cycle from which the nonconformance procedures apply shall be specified.
c. The SPAP shall identify the method and tool to protect the supplied software, a checksum-type key calculation for the delivered operational software, and a labelling method for the delivered media.
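A checksum-type key and media label of the kind called for in <6.4>.c can be produced with standard library routines; in the sketch below the product name, version and file path are hypothetical.

```python
import zlib
from pathlib import Path

def media_label(product: str, version: str, image: Path) -> str:
    """Build a delivery label embedding a CRC-32 key of the software image
    (illustrative scheme; the label format is not prescribed by the Standard)."""
    crc = zlib.crc32(image.read_bytes()) & 0xFFFFFFFF
    return f"{product} v{version} CRC32={crc:08X}"
```

The customer can recompute the key on the received media and compare it with the label.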
<6.5> Process metrics
a. The SPAP shall describe the process metrics derived from the defined quality models, the means to collect, store and analyze them, and the way they are used to manage the development processes.
<6.6> Reuse of software
a. The SPAP shall describe the approach for the reuse of existing software, including delta qualification.
<6.7> Product assurance planning for individual processes and activities
a. The following processes and activities shall be covered, taking into account the project scope and life cycle:
1. software requirements analysis;
2. software architectural design and design of software items;
3. coding;
4. testing and validation (including regression testing);
5. verification;
6. software delivery and acceptance;
7. operations and maintenance.
<6.8> Procedures and standards
a. The SPAP shall describe or list by reference all procedures and standards applicable to the development of the software in the project.
b. The software product assurance measures to ensure adherence to the project procedures and standards shall be described.
c. The standards and procedures to be described or listed in accordance with B.2.1<6.8>a shall be as a minimum those covering the following aspects:
1. project management;
2. risk management;
3. configuration and documentation management;
4. verification and validation;
5. requirements engineering;
6. design;
7. coding;
8. metrication;
9. nonconformance control;
10. audits;
11. alerts;
12. procurement;
13. reuse of existing software;
14. use of methods and tools;
15. numerical accuracy;
16. delivery, installation and acceptance;
17. operations;
18. maintenance;
19. device programming and marking.
<7> Software product quality assurance
a. The SPAP shall describe the approach taken to ensure the quality of the software product.
b. The description of the approach specified in B.2.1<7>a shall include the:
1. specification of the product metrics, their target values and the means to collect them;
2. definition of a timely metrication programme;
3. analyses to be performed on the collected metrics;
4. way the results are fed back to the development team;
5. documentation quality requirements;
6. assurance activities meant to ensure that the product meets the quality requirements.
<8> Compliance matrix to software product assurance requirements
a. The SPAP shall include the compliance matrix to the applicable software product assurance requirements (e.g. ECSS-Q-ST-80 clauses, as tailored by a product assurance requirements document), or provide a reference to it.
b. For each software product assurance requirement, the following information shall be provided:
1. requirement identifier;
2. compliance (C = compliant, NC = non-compliant, NA = not applicable);
3. reference to the project documentation covering the requirement (e.g. section of the software product assurance plan);
4. remarks.
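A compliance matrix with these four fields can be generated mechanically; in the sketch below the requirement identifiers, statuses and references are placeholders, not real project data.

```python
import csv
import io

# Each entry: requirement identifier, compliance (C/NC/NA), document reference, remarks
# (placeholder values for illustration)
entries = [
    ("ECSS-Q-ST-80C 6.3.6.7", "C",  "SPAP section 6.4", ""),
    ("ECSS-Q-ST-80C 7.1.5",   "C",  "SPAP section 7.2", ""),
    ("ECSS-Q-ST-80C 7.5.1",   "NA", "-", "no firmware in this project"),
]

def compliance_matrix(rows) -> str:
    """Render the compliance matrix as CSV text with a header row."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["Requirement", "Compliance", "Reference", "Remarks"])
    writer.writerows(rows)
    return buf.getvalue()
```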
B.2.2 Special remarks
The response to this DRD may be combined with the response to the project product assurance plan, as defined in ECSS-Q-ST-10.
Annex C (normative)
Software product assurance milestone
report (SPAMR) - DRD
C.1 DRD identification
C.1.1 Requirement identification and source document
The software product assurance milestone report (SPAMR) is called from the normative provisions summarized in Table C-1.
Table C-1: SPAMR traceability to ECSS-E-ST-40 and ECSS-Q-ST-80 clauses
ECSS Standard: ECSS-Q-ST-80
Clause DRD section
5.2.2.3 All
5.6.1.2 <5>.1
5.6.1.3 <5>.2
6.2.5.4 <7>
6.2.5.5 <7>
6.2.6.3 <4>
6.2.6.7 <4>
6.2.8.5 <6>
6.3.3.4 <6>
6.3.3.6 <6>.1
6.3.3.7 <6>.2
6.3.4.7 <6>
6.3.5.3 <8>
6.3.5.5 <8>
6.3.5.12 <8>
7.1.6 <7>
7.1.8 <7>
C.1.2 Purpose and objective
The software product assurance milestone report is a constituent of the product assurance file (PAF).
The main purpose of the software product assurance milestone report is to collect and present at project milestones the reporting on the software product assurance activities performed during the past project phases.
C.2 Expected response
C.2.1 Scope and content
<1> Introduction
a. The SPAMR shall contain a description of the purpose, objective, content and the reason prompting its preparation.
<2> Applicable and reference documents
a. The SPAMR shall list the applicable and reference documents to support the generation of the document.
<3> Terms, definitions and abbreviated terms
a. The SPAMR shall include any additional terms, definitions or abbreviated terms used.
<4> Verification activities performed
a. The SPAMR shall contain reporting on verification activities performed by the product assurance function, including:
1. reviews;
2. inspections;
3. walkthroughs;
4. review of traceability matrices;
5. documents reviewed.
b. The SPAMR shall contain reporting on the verification of the measures applied for the handling of critical software.
<5> Methods and tools
a. The SPAMR shall include or reference a justification of the suitability of the methods and tools applied in all the activities of the development cycle, including requirements analysis, software specification, design, coding, validation, testing, configuration management, verification and product assurance.
b. The SPAMR shall include reporting on the correct use of methods and tools.
<6> Adherence to design and coding standards
a. The SPAMR shall include reporting on the adherence of software products to the applicable modelling, design and coding standards, including:
1. reporting on the application of measures meant to ensure that the design complexity and modularity meet the quality requirements;
2. reporting on design documentation w.r.t. suitability for maintenance.
<7> Product and process metrics
a. The SPAMR shall include reporting on the collected product and process metrics, the relevant analyses performed, the corrective actions undertaken and the status of these actions.
b. The results of the software maturity analysis shall also be reported.
<8> Testing and validation
a. The SPAMR shall include reporting on adequacy of the testing and validation documentation (including feasibility, traceability, repeatability), and on the achieved test coverage w.r.t. stated goals.
<9> SPRs and SW NCRs
a. The SPAMR shall include reporting on the status of software problem reports and nonconformances relevant to software.
<10> References to progress reports
a. Whenever relevant and up-to-date information has been already delivered as part of the regular PA progress reporting, a representative summary shall be provided, together with a detailed reference to the progress report(s) containing that information.
C.2.2 Special remarks
The response to this DRD may be combined with the response to the project product assurance report, as defined in ECSS-Q-ST-10.
Annex D (normative)
Tailoring of this Standard based on
software criticality
D.1 Software criticality categories
The following software criticality categories are defined, based on the severity of the consequences of system failures (ref. ECSS-Q-ST-40 Table 5-1, and ECSS-Q-ST-30 Table 6-1).
Table D-1: Software criticality categories
Category | Definition
A | Software that if not executed, or if not correctly executed, or whose anomalous behaviour can cause or contribute to a system failure resulting in: Catastrophic consequences
B | Software that if not executed, or if not correctly executed, or whose anomalous behaviour can cause or contribute to a system failure resulting in: Critical consequences
C | Software that if not executed, or if not correctly executed, or whose anomalous behaviour can cause or contribute to a system failure resulting in: Major consequences
D | Software that if not executed, or if not correctly executed, or whose anomalous behaviour can cause or contribute to a system failure resulting in: Minor or Negligible consequences
D.2 Applicability matrix
The following applicability matrix represents a tailoring of the requirements of this Standard based on the software criticality categories defined in D.1.
For each clause of this Standard and for each software criticality category, an indication is given whether that clause is applicable (Y), not applicable (N), or applicable under the conditions thereby specified to that software criticality category.
Table D-2: Applicability matrix based on software criticality
Clause Description A B C D
5 Software product assurance programme implementation ‐ ‐ ‐ ‐
5.1 Organization and responsibility ‐ ‐ ‐ ‐
5.1.1 Organization Y Y Y Y
5.1.2 Responsibility and authority ‐ ‐ ‐ ‐
5.1.2.1 Y Y Y Y
5.1.2.2 Y Y Y Y
5.1.2.3 Y Y Y Y
5.1.3 Resources ‐ ‐ ‐ ‐
5.1.3.1 Y Y Y Y
5.1.3.2 Y Y Y N
5.1.4 Software product assurance manager/engineer ‐ ‐ ‐ ‐
5.1.4.1 Y Y Y Y
5.1.4.2 Y Y Y Y
5.1.5 Training ‐ ‐ ‐ ‐
5.1.5.1 Y Y Y Expected output not required
5.1.5.2 Y Y Y N
5.1.5.3 Y Y Y Y
5.1.5.4 Y Y Y Y
5.2 Software product assurance programme management ‐ ‐ ‐ ‐
5.2.1 Software product assurance planning and control ‐ ‐ ‐ ‐
5.2.1.1 Y Y Y Y
5.2.1.2 Y Y Y Y
5.2.1.3 Y Y Y Y
5.2.1.4 Y Y Y Y
5.2.1.5 Y Y Y Y
5.2.2 Software product assurance reporting ‐ ‐ ‐ ‐
5.2.2.1 Y Y Y Y
5.2.2.2 Y Y Y Y
5.2.2.3 Y Y Y Y
5.2.3 Audits Y Y Y Audits planned and performed only when necessary
5.2.4 Alerts Y Y Y Y
5.2.5 Software problems ‐ ‐ ‐ ‐
5.2.5.1 Y Y Y Y
5.2.5.2 Y Y Y Y
5.2.5.3 Y Y Y Y
5.2.5.4 Y Y Y Y
5.2.6 Nonconformances ‐ ‐ ‐ ‐
5.2.6.1 Y Y Y Y
5.2.6.2 Y Y Y Y
5.2.7 Quality requirements and quality models ‐ ‐ ‐ ‐
5.2.7.1 Y Y Y Y
5.2.7.2 Y Y Y Relevant characteristics only (e.g. suitability for safety is not relevant for cat. D software)
5.3 Risk management and critical item control ‐ ‐ ‐ ‐
5.3.1 Risk management Y Y Y Y
5.3.2 Critical item control ‐ ‐ ‐ ‐
5.3.2.1 Y Y Y Y
5.3.2.2 Y Y Y Y
5.4 Supplier selection and control ‐ ‐ ‐ ‐
5.4.1 Supplier selection Y Y Y Y
5.4.1.1 Y Y Y Expected output not required
5.4.1.2 Y Y Y Y
5.4.2 Supplier requirements ‐ ‐ ‐ ‐
5.4.2.1 Y Y Y Y
5.4.2.2 Y Y Y N
5.4.3 Supplier monitoring ‐ ‐ ‐ ‐
5.4.3.1 Y Y Y Y
5.4.3.2 Y Y Y Y
5.4.3.3 Y Y Y Y
5.4.3.4 Y Y Y N
5.4.4 Criticality classification Y Y Y Y
5.5 Procurement ‐ ‐ ‐ ‐
5.5.1 Procurement documents Y Y Y Y
5.5.2 Review of procured software component list Y Y Y Y
5.5.3 Procurement details Y Y Y Y
5.5.4 Identification Y Y Y Y
5.5.5 Inspection Y Y Y Y
5.5.6 Exportability Y Y Y Y
5.6 Tools and supporting environment ‐ ‐ ‐ ‐
5.6.1 Methods and tools ‐ ‐ ‐ ‐
5.6.1.1 Y Y Y The proposed methods and tools shall have been successfully used at least in one project before (possibly a non-space project)
5.6.1.2 Y Y Y Expected output not required
5.6.1.3 Y Y Y Expected output not required
5.6.2 Development environment selection ‐ ‐ ‐ ‐
5.6.2.1 Y Y Y Expected output not required
5.6.2.2 Y Y Y Expected output not required
5.6.2.3 Y Y Y Y
5.7 Assessment and improvement process ‐ ‐ ‐ ‐
5.7.1 Process assessment Y Y Y N
5.7.2 Assessment process ‐ ‐ ‐ ‐
5.7.2.1 Y Y Y N
5.7.2.2 Y Y Y N
5.7.2.3 Y Y Y N
5.7.2.4 Y Y Y N
5.7.3 Process improvement ‐ ‐ ‐ ‐
5.7.3.1 Y Y Y N
5.7.3.2 Y Y Y N
5.7.3.3 Y Y Y N
6 Software process assurance ‐ ‐ ‐ ‐
6.1 Software development life cycle ‐ ‐ ‐ ‐
6.1.1 Life cycle definition Y Y Y Y
6.1.2 Quality objectives Y Y Y Y
6.1.3 Life cycle definition review Y Y Y Y
6.1.4 Life cycle resources Y Y Y Y
6.1.5 Software validation process schedule Y Y Y Y
6.2 Requirements applicable to all software engineering processes ‐ ‐ ‐ ‐
6.2.1 Documentation of processes ‐ ‐ ‐ ‐
6.2.1.1 Y Y Y Y
6.2.1.2 Y Y Y Y
6.2.1.3 Y Y Y Y
6.2.1.4 Y Y Y Y
6.2.1.5 Y Y Y Y
6.2.1.6 Y Y Y Y
6.2.1.7 Y Y Y Y
6.2.1.8 Y Y Y Y
6.2.1.9 Y Y Y N
6.2.2 Software dependability and safety ‐ ‐ ‐ ‐
6.2.2.1 Y Y Y Y
6.2.2.2 Y Y Y N
6.2.2.3 Y Y Y N
6.2.2.4 Y Y Y N
6.2.2.5 Y Y Y N
6.2.2.6 Y Y Y N
6.2.2.7 Y Y Y N
6.2.2.8 Y Y Y Y
6.2.2.9 Y Y Y Y
6.2.3 Handling of critical software ‐ ‐ ‐ ‐
6.2.3.1 Y Y Y Y
6.2.3.2 Y Y Y N
6.2.3.3 Y Y Y N
6.2.3.4 Y Y Y N
6.2.3.5 Y Y Y N
6.2.3.6 Y Y Y N
6.2.3.7 Y Y Y N
6.2.3.8 Y Y N N
6.2.4 Software configuration management ‐ ‐ ‐ ‐
6.2.4.1 Y Y Y Y
6.2.4.2 Y Y Y Y
6.2.4.3 Y Y Y Y
6.2.4.4 Y Y Y Y
6.2.4.5 Y Y Y Y
6.2.4.6 Y Y Y Y
6.2.4.7 Y Y Y Y
6.2.4.8 Y Y Y Y
6.2.4.9 Y Y Y Y
6.2.4.10 Y Y Y Y
6.2.4.11 Y Y Y Y
6.2.5 Process metrics ‐ ‐ ‐ ‐
6.2.5.1 Y Y Y Y
6.2.5.2 Y Y Y Y
6.2.5.3 Y Y Y Y
6.2.5.4 Y Y Y Limited to number of problems detected during validation
6.2.5.5 Y Y Y Y
6.2.6 Verification ‐ ‐ ‐ ‐
6.2.6.1 Y Y Y Y
6.2.6.2 Y Y Y Y
6.2.6.3 Y Y Y Y
6.2.6.4 Y Y Y Y
6.2.6.5 Y Y Y N
6.2.6.6 Y Y Y N
6.2.6.7 Y Y Y Y
6.2.6.8 Y Y Y Y
6.2.6.9 Y Y Y Y
6.2.6.10 Y Y Y Y
6.2.6.11 Y Y Y Y
6.2.6.12 Y Y Y Y
6.2.6.13 Y Y N N
6.2.7 Reuse of existing software ‐ ‐ ‐ ‐
6.2.7.1 Y Y Y Y
6.2.7.2 Y Y Y Y
6.2.7.3 Y Y Y Y
6.2.7.4 Y Y Y Bullets c, d, e and g not applicable. Bullet b limited to architectural design
6.2.7.5 Y Y Y Y
6.2.7.6 Y Y Y Y
6.2.7.7 Y Y Y Limited to the extent necessary to ensure maintainability of the software
6.2.7.8 Y Y Y Limited to the extent necessary to ensure maintainability of the software
6.2.7.9 Y Y Y Y
6.2.7.10 Y Y Y Y
6.2.7.11 Y Y Y Y
6.2.8 Automatic code generation ‐ ‐ ‐ ‐
6.2.8.1 Y Y Y Y
6.2.8.2 Y Y Y Y
6.2.8.3 Y Y Y Y
6.2.8.4 Y Y Y Y
6.2.8.5 Y Y Y Y
6.2.8.6 Y Y Y Y
6.2.8.7 Y Y Y Y
6.3 Requirements applicable to individual software engineering processes or activities ‐ ‐ ‐ ‐
6.3.1 Software-related system requirements process ‐ ‐ ‐ ‐
6.3.1.1 Y Y Y Y
6.3.1.2 Y Y Y Y
6.3.1.3 Y Y Y Y
6.3.2 Software requirements analysis ‐ ‐ ‐ ‐
6.3.2.1 Y Y Y Y
6.3.2.2 Y Y Y Y
6.3.2.3 Y Y Y Y
6.3.2.4 Y Y Y Y
6.3.2.5 Y Y Y Y
6.3.3 Software architectural design and design of software items ‐ ‐ ‐ ‐
6.3.3.1 Y Y Y Documentation control only
6.3.3.2 Y Y Y Only recommended
6.3.3.3 Y Y Y N
6.3.3.4 Y Y Y Only if design standards are applied (6.3.2.2)
6.3.3.5 Y Y Y N
6.3.3.6 Y Y Y N
6.3.3.7 Y Y Y Y
6.3.4 Coding ‐ ‐ ‐ ‐
6.3.4.1 Y Y Y Y
6.3.4.2 Y Y Y Y
6.3.4.3 Y Y Y N
6.3.4.4 Y Y Y N
6.3.4.5 Y Y Y Y
6.3.4.6 Y Y Y Y
6.3.4.7 Y Y Y Y
6.3.4.8 Y Y Y The code shall be put under configuration control at the beginning of validation testing
6.3.5 Testing and validation ‐ ‐ ‐ ‐
6.3.5.1 Y Y Y No formal unit testing and integration activity required
6.3.5.2 Y Y Y No formal unit testing and integration activity required
6.3.5.3 Y Y Y Test procedures and data verified by sample
6.3.5.4 Y Y Y Applicable to validation and acceptance tests only
6.3.5.5 Y Y Y Y
6.3.5.6 Y Y Y Y
6.3.5.7 Y Y Y Y
6.3.5.8 Y Y Y Y
6.3.5.9 Y Y Y N
6.3.5.10 Y Y Y N
6.3.5.11 Y Y Y Y
6.3.5.12 Y Y Y Y
6.3.5.13 Y Y Y Y
6.3.5.14 Y Y Y Applicable to validation and acceptance tests only
6.3.5.15 Y Y Y Y
6.3.5.16 Y Y Y Y
6.3.5.17 Y Y Y Y
6.3.5.18 Y Y Y Y
6.3.5.19 Y Y Y N
6.3.5.20 Y Y Y Y
6.3.5.21 Y Y Y Y
6.3.5.22 Y Y Y Y
6.3.5.23 Y Y Y Y
6.3.5.24 Y Y Y Y
6.3.5.25 Y Y Y Y
6.3.5.26 Y Y Y Y
6.3.5.27 Y Y Y Y
6.3.5.28 Y Y N N
6.3.5.29 Y Y Y Y
6.3.5.30 Y Y Y N
6.3.5.31 Y Y Y N
6.3.5.32 Y Y Y Y
6.3.6 Software delivery and acceptance ‐ ‐ ‐ ‐
6.3.6.1 Y Y Y Y
6.3.6.2 Y Y Y Y
6.3.6.3 Y Y Y Y
6.3.6.4 Y Y Y Y
6.3.6.5 Y Y Y Y
6.3.6.6 Y Y Y Y
6.3.6.7 Y Y Y Y
6.3.6.8 Y Y Y Y
6.3.6.9 Y Y Y Y
6.3.7 Operations ‐ ‐ ‐ ‐
6.3.7.1 Y Y Y Y
6.3.7.2 Y Y Bullet on safety features not applicable Bullet on safety features not applicable
6.3.7.3 Y Y Y Y
6.3.8 Maintenance ‐ ‐ ‐ ‐
6.3.8.1 Y Y Y Y
6.3.8.2 Y Y Y Y
6.3.8.3 Y Y Y Y
6.3.8.4 Y Y Y Y
6.3.8.5 Y Y Y Y
6.3.8.6 Y Y Y Y
6.3.8.7 Y Y Y Statistical data not collected
7 Software product quality assurance ‐ ‐ ‐ ‐
7.1 Product quality objectives and metrication ‐ ‐ ‐ ‐
7.1.1 Deriving of requirements Y Y Y Y
7.1.2 Quantitative definition of quality requirements Y Y Y Y
7.1.3 Assurance activities for product quality requirements Y Y Y Y
7.1.4 Product metrics Y Y Y Bullet d.1 not applicable
7.1.5 Basic metrics Y Y Y Design-relevant and fault density/failure intensity metrics not required
7.1.6 Reporting of metrics Y Y Y Y
7.1.7 Numerical accuracy Y Y Y Y
7.1.8 Analysis of software maturity Y Y Y N
7.2 Product quality requirements ‐ ‐ ‐ ‐
7.2.1 Requirements baseline and technical specification ‐ ‐ ‐ ‐
7.2.1.1 Y Y Y Y
7.2.1.2 Y Y Y Y
7.2.1.3 Y Y Y Y
7.2.2 Design and related documentation ‐ ‐ ‐ ‐
7.2.2.1 Y Y Y Y
7.2.2.2 Y Y Y Y
7.2.2.3 Y Y Y Y
7.2.3 Test and validation documentation ‐ ‐ ‐ ‐
7.2.3.1 Y Y Y Y
7.2.3.2 Y Y Y Y
7.2.3.3 Y Y Y Y
7.2.3.4 Y Y Y Y
7.2.3.5 Y Y Y Y
7.2.3.6 Y Y Y Y
7.3 Software intended for reuse ‐ ‐ ‐ ‐
7.3.1 Customer requirements Y Y Y Y
7.3.2 Separate documentation Y Y Y Y
7.3.3 Self-contained information Y Y Y Y
7.3.4 Requirements for intended reuse Y Y Y Y
7.3.5 Configuration management for intended reuse Y Y Y Y
7.3.6 Testing on different platforms Y Y Y Y
7.3.7 Certificate of conformance Y Y Y Y
7.4 Standard hardware for operational system ‐ ‐ ‐ ‐
7.4.1 Hardware procurement Y Y Y Y
7.4.2 Service procurement Y Y Y Y
7.4.3 Constraints Y Y Y Y
7.4.4 Selection Y Y Y Y
7.4.5 Maintenance Y Y Y Y
7.5 Firmware ‐ ‐ ‐ ‐
7.5.1 Device programming Y Y Y Y
7.5.2 Marking Y Y Y Y
7.5.3 Calibration Y Y Y Y
Annex E (informative)
List of requirements with built-in tailoring capability

The following requirements are applicable under specific conditions, as described in the requirement's text.
5.1.4.2 The software product assurance manager/engineer shall report to the project manager (through the project product assurance manager, if any).
5.2.2.1 The supplier shall report on a regular basis on the status of the software product assurance programme implementation, if appropriate as part of the overall product assurance reporting of the project.
6.2.3.4 In case of minor changes in tools that affect the generation of the executable code, a binary comparison of the executable code generated by the different tools can be used to verify that no modifications are introduced.
6.2.6.13 This requirement is applicable where the risks associated with the project justify the costs involved. The customer may consider a less rigorous level of independence, e.g. an independent team in the same organization.
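The binary comparison foreseen for clause 6.2.3.4 can be sketched as follows. This is an illustrative sketch only, not part of the Standard; the file names are hypothetical, and comparing SHA-256 digests is just one way to establish byte-for-byte identity of two executables.

```python
# Illustrative sketch of the binary comparison described in clause 6.2.3.4:
# verify that two toolchains produced byte-identical executable code.
import hashlib


def file_digest(path: str) -> str:
    """Return the SHA-256 digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()


def binaries_identical(path_a: str, path_b: str) -> bool:
    """True if the two executables are byte-for-byte identical."""
    return file_digest(path_a) == file_digest(path_b)
```

Any difference in the digests indicates that the tool change did modify the generated code, in which case the relaxation allowed by the clause does not apply.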
The following requirements foresee an agreement between the customer and the supplier.
6.3.2.5 Prior to the technical specification elaboration, customer and supplier shall agree on the following principles and rules as a minimum: […].
6.3.5.2 Based on the criticality of the software, test coverage goals for each testing level shall be agreed between the customer and the supplier and their achievement monitored by metrics: […].
Annex F (informative)
Document organization and content at each milestone

F.1 Introduction
The following tables show the organization of the Expected Output of the clauses of this Standard, sorted per review, then per destination file, then per DRD.
When no DRD is available, "‐" is shown.
F.2 ECSS-Q-ST-80 Expected Output at SRR
Clause Expected Output Dest. File DRD Section
7.1.1.a Requirement baseline RB SSS <5.9>
7.1.2.a Requirement baseline RB SSS <5.9>
7.2.1.1.a Requirement baseline RB SSS <5.9>
7.2.1.3.a Requirement baseline RB SSS <5.1>
5.4.4 Safety and dependability analyses results for lower level suppliers RB ‐
5.1.2.1 Software product assurance plan PAF SPAP <5.1>
5.1.2.2 Software product assurance plan PAF SPAP <5.1>, <5.2>
5.1.2.3 Software product assurance plan PAF SPAP <5.1>
5.1.3.1 Software product assurance plan PAF SPAP <5.3>
5.1.3.2 Software product assurance plan PAF SPAP <5.1>, <5.3>
5.1.4.1 Software product assurance plan PAF SPAP <5.1>, <5.3>
5.2.1.1 Software product assurance plan PAF SPAP All
5.2.1.5 Software product assurance plan PAF SPAP <8>
5.2.6.1.a.a NCR-SW procedure as part of the Software product assurance plan PAF SPAP
5.2.6.2 Software product assurance plan PAF SPAP <6.4>
5.6.1.1 Software product assurance plan PAF SPAP <5.8>
6.1.1 Software product assurance plan PAF SPAP <6.1>
6.1.5 Software product assurance plan PAF SPAP <6.1>
6.2.1.4 Software product assurance plan PAF SPAP <6.2>
6.2.4.8.a Software product assurance plan PAF SPAP <6.4>
6.2.4.9 Software product assurance plan PAF SPAP <6.4>
6.2.4.11.a Software product assurance plan PAF SPAP <6.4>
6.2.5.1 Software product assurance plan PAF SPAP <6.5>
6.2.5.2 Software product assurance plan PAF SPAP <6.5>
6.2.7.2.a Software reuse approach, including approach to delta qualification PAF SPAP <6.6>
6.2.7.3.a Software reuse approach, including approach to delta qualification PAF SPAP <6.6>
6.2.7.4.a Software reuse approach, including approach to delta qualification PAF SPAP <6.6>
6.2.7.5.a Software reuse approach, including approach to delta qualification PAF SPAP <6.6>
7.1.3 Software product assurance plan PAF SPAP <7>
7.1.4 Software product assurance plan PAF SPAP <7>
7.1.5 Software product assurance plan PAF SPAP <7>
7.2.2.3.a Software product assurance plan PAF SPAP <6.7>
5.2.2.3 Software product assurance milestone report PAF SPAMR All
5.6.1.2 Software product assurance milestone report PAF SPAMR <5>
6.2.6.12 Software product assurance milestone report PAF SPAMR <4>
5.2.3 Audit plan and schedule PAF ‐
5.4.2.1 Software product assurance requirements for suppliers PAF ‐
5.4.2.2 Software product assurance requirements for suppliers PAF ‐
6.2.2.1 Criticality classification of software products PAF ‐
6.2.8.4 Modelling standards PAF ‐
6.3.3.2 Design standards PAF ‐
7.4.1.b Receiving inspection report PAF ‐
5.5.2 Software development plan MGT SDP <4.8>
5.6.2.1 Software development plan MGT SDP <5.4>
5.6.2.2 Software development plan MGT SDP <5.4>
6.2.4.2 Software configuration management plan MGT SCMP
7.3.5 Configuration management for reusable components MGT SCMP
5.1.5.1 Training plan MGT ‐
5.2.6.1.b Identification of SW experts in NRB MGT ‐
5.5.3 Procurement data MGT ‐
6.2.7.2.b Software reuse file DJF SRF <6>
6.2.7.3.b Software reuse file DJF SRF <4>, <5>
6.2.7.4.b Software reuse file DJF SRF <5>
6.2.7.5.b Software reuse file DJF SRF <6>
6.2.7.6 Software reuse file DJF SRF <4>, <5>
6.2.7.7 Software reuse file DJF SRF <8>
6.2.7.8 Software reuse file DJF SRF <8>
6.2.7.11 Software reuse file DJF SRF <9>
6.2.6.4 Software problem reports DJF ‐
6.2.6.13.a ISVV plan DJF ‐
6.3.5.8 Software problem reports DJF ‐
6.3.5.28.a ISVV plan DJF ‐
7.4.1.a Justification of selection of operational ground equipment DJF ‐
7.4.2 Justification of selection of operational support services DJF ‐
7.4.3 Justification of selection of operational ground equipment DJF ‐
7.4.4 Justification of selection of operational ground equipment DJF ‐
F.3 ECSS-Q-ST-80 Expected Output at PDR
Clause Expected output Dest. File DRD Section
6.3.2.4 Software requirements specification TS SRS <5>
7.1.1.b Technical specification TS SRS <5.10>
7.1.2.b Technical specification TS SRS <5.10>
7.2.1.1.b Technical specification TS SRS <5.10>
7.2.1.3.b Technical specification TS SRS <6>
7.3.4 Technical specification for reusable components TS ‐
5.2.1.1 Software product assurance plan PAF SPAP All
5.2.1.5 Software product assurance plan PAF SPAP <8>
5.2.6.2 Software product assurance plan PAF SPAP <6.4>
5.2.7.1 Software product assurance plan PAF SPAP <5.5>
5.2.7.2 Software product assurance plan PAF SPAP <5.5>
5.4.3.3 Next level suppliers' software product assurance plan PAF SPAP All
5.4.3.4 Next level suppliers' software product assurance plan PAF SPAP All
5.6.1.1 Software product assurance plan PAF SPAP <5.8>
6.1.1 Software product assurance plan PAF SPAP <6.1>
6.1.5 Software product assurance plan PAF SPAP <6.1>
6.2.1.4 Software product assurance plan PAF SPAP <6.2>
6.2.3.1.a Software product assurance plan PAF SPAP <6.3>
6.2.3.2 Software product assurance plan PAF SPAP <6.3>
6.2.3.4 Software product assurance plan PAF SPAP <6.7>
6.2.3.5 Software product assurance plan PAF SPAP <6.7>
6.2.4.8.a Software product assurance plan PAF SPAP <6.4>
6.2.4.9 Software product assurance plan PAF SPAP <6.4>
6.2.4.11.a Software product assurance plan PAF SPAP <6.4>
6.2.5.1 Software product assurance plan PAF SPAP <6.5>
6.2.5.2 Software product assurance plan PAF SPAP <6.5>
6.2.7.2.a Software reuse approach, including approach to delta qualification PAF SPAP <6.6>
6.2.7.3.a Software reuse approach, including approach to delta qualification PAF SPAP <6.6>
6.2.7.4.a Software reuse approach, including approach to delta qualification PAF SPAP <6.6>
6.2.7.5.a Software reuse approach, including approach to delta qualification PAF SPAP <6.6>
6.3.3.3 Software product assurance plan PAF SPAP <6.8>
6.3.3.5 Software product assurance plan PAF SPAP <6.7>
6.3.3.7.a Software product assurance plan PAF SPAP <6.7>
6.3.4.3 Software product assurance plan PAF SPAP <6.8>
6.3.4.6 Software product assurance plan PAF SPAP <6.8>
6.3.5.1 Software product assurance plan PAF SPAP <6.7>
6.3.5.2 Software product assurance plan PAF SPAP <6.7>
7.1.3 Software product assurance plan PAF SPAP <7>
7.1.4 Software product assurance plan PAF SPAP <7>
7.1.5 Software product assurance plan PAF SPAP <7>
7.2.2.3.a Software product assurance plan PAF SPAP <6.7>
7.5.1 Software product assurance plan PAF SPAP <6.8>
7.5.2 Software product assurance plan PAF SPAP <6.8>
5.2.2.3 Software product assurance milestone report PAF SPAMR All
5.6.1.2 Software product assurance milestone report PAF SPAMR <5>
6.2.3.3 Software product assurance milestone report PAF SPAMR <4>
6.2.6.12 Software product assurance milestone report PAF SPAMR <4>
5.2.5.1 Software problem reporting procedures PAF ‐
5.2.5.2 Software problem reporting procedures PAF ‐
5.2.5.3 Software problem reporting procedures PAF ‐
5.5.5 Receiving inspection report PAF ‐
6.2.1.6 Procedures and standards PAF ‐
6.2.1.7 Procedures and standards PAF ‐
6.2.2.1 Criticality classification of software products PAF ‐
6.2.2.2 Software dependability and safety analysis report PAF ‐
6.2.2.3 Criticality classification of software components PAF ‐
6.2.2.7 Software dependability and safety analysis report PAF ‐
6.2.3.1.b Software dependability and safety analysis report PAF ‐
6.2.8.4 Modelling standards PAF ‐
6.3.3.2 Design standards PAF ‐
6.3.4.1 Coding standards PAF ‐
6.3.4.2 Coding standards PAF ‐
6.3.4.4 Coding standards and description of tools PAF ‐
7.4.1.b Receiving inspection report PAF ‐
5.5.2 Software development plan MGT SDP <4.8>
5.6.2.1 Software development plan MGT SDP <5.4>
5.6.2.2 Software development plan MGT SDP <5.4>
6.3.4.5 Software development plan MGT SDP <5.4>
6.2.4.2 Software configuration management plan MGT SCMP
7.3.5 Configuration management for reusable components MGT SCMP
5.5.3 Procurement data MGT ‐
7.1.7 Numerical accuracy analysis DJF SVR <6>
6.2.6.1 Software verification plan DJF SVerP <6.3>
6.2.8.2 Validation and testing documentation DJF SValP <4.1>
6.2.8.7 Validation and testing documentation DJF SValP <4.1>
6.3.5.22 Test and validation documentation DJF SValP <4>
6.3.5.23 Test and validation documentation DJF SValP <4.4>
6.3.5.24 Test and validation documentation DJF SValP <4.6>
6.3.5.25 Test and validation documentation DJF SValP <5>
6.3.5.29 Test and validation documentation DJF SValP <6>
6.2.8.2 Validation and testing documentation DJF SUITP <7.6>
6.2.8.7 Validation and testing documentation DJF SUITP <7.6>
6.3.5.22 Test and validation documentation DJF SUITP <5>
6.3.5.23 Test and validation documentation DJF SUITP <5.3>
6.3.5.24 Test and validation documentation DJF SUITP <5.5>
6.3.5.25 Test and validation documentation DJF SUITP <9.2>, <10>
6.2.7.2.b Software reuse file DJF SRF <6>
6.2.7.3.b Software reuse file DJF SRF <4>, <5>
6.2.7.4.b Software reuse file DJF SRF <5>
6.2.7.5.b Software reuse file DJF SRF <6>
6.2.7.6 Software reuse file DJF SRF <4>, <5>
6.2.7.7 Software reuse file DJF SRF <8>
6.2.7.8 Software reuse file DJF SRF <8>
6.2.7.11 Software reuse file DJF SRF <9>
6.2.6.4 Software problem reports DJF ‐
6.2.6.13.a ISVV plan DJF ‐
6.2.6.13.a ISVV report DJF ‐
6.3.5.8 Software problem reports DJF ‐
6.3.5.28.a ISVV plan DJF ‐
6.3.5.28.b ISVV report DJF ‐
7.4.1.a Justification of selection of operational ground equipment DJF ‐
7.4.2 Justification of selection of operational support services DJF ‐
7.4.3 Justification of selection of operational ground equipment DJF ‐
7.4.4 Justification of selection of operational ground equipment DJF ‐
7.2.2.3.b Justification of design choices DDF SDD <4.5>
F.4 ECSS-Q-ST-80 Expected Output at CDR
Clause Expected output Dest. File DRD Section
5.2.1.3 Software product assurance plan PAF SPAP All
6.2.3.1.a Software product assurance plan PAF SPAP <6.3>
6.2.3.2 Software product assurance plan PAF SPAP <6.3>
6.2.3.4 Software product assurance plan PAF SPAP <6.7>
6.2.3.5 Software product assurance plan PAF SPAP <6.7>
5.2.2.3 Software product assurance milestone report PAF SPAMR All
6.2.3.3 Software product assurance milestone report PAF SPAMR <4>
6.2.6.12 Software product assurance milestone report PAF SPAMR <4>
5.5.5 Receiving inspection report PAF ‐
6.2.2.5 Software dependability and safety analysis report PAF ‐
6.2.2.6 Software dependability and safety analysis report PAF ‐
6.2.2.7 Software dependability and safety analysis report PAF ‐
6.2.3.1.b Software dependability and safety analysis report PAF ‐
6.3.5.7 Statement of compliance with test plans and procedures PAF ‐
6.3.5.11 Statement of compliance with test plans and procedures PAF ‐
6.2.8.2 Validation and testing documentation DJF SVS <5.6>
6.2.8.7 Validation and testing documentation DJF SVS <5.6>
6.3.5.25 Test and validation documentation DJF SVS <7.2>, <8>
6.3.5.29 Test and validation documentation DJF SVS <6>
6.3.5.32 Software validation specification DJF SVS <5>
6.2.6.5 Software verification report DJF SVR <4.4>
6.2.6.6 Software verification report DJF SVR <4.4>
7.1.7 Numerical accuracy analysis DJF SVR <6>
7.2.3.6 Software verification report DJF SVR <4.5>
6.2.8.2 Validation and testing documentation DJF SUITP <7.6>
6.2.8.7 Validation and testing documentation DJF SUITP <7.6>
6.3.5.22 Test and validation documentation DJF SUITP <5>
6.3.5.23 Test and validation documentation DJF SUITP <5.3>
6.3.5.24 Test and validation documentation DJF SUITP <5.5>
6.3.5.25 Test and validation documentation DJF SUITP <9.2>, <10>
6.2.7.9 Software reuse file DJF SRF <8>
6.2.7.11 Software reuse file DJF SRF <9>
6.2.6.4 Software problem reports DJF ‐
6.2.6.13.a ISVV report DJF ‐
6.3.5.6 Nonconformance reports and software problem reports DJF ‐
6.3.5.8 Software problem reports DJF ‐
6.3.5.13 Testing and validation reports DJF ‐
6.3.5.16 Updated test documentation DJF ‐
6.3.5.17 Updated test documentation DJF ‐
6.3.5.18 Updated test documentation DJF ‐
6.3.5.28.b ISVV report DJF ‐
6.3.5.30 Testing and validation reports DJF ‐
6.3.5.31 Testing and validation reports DJF ‐
7.3.6 Verification and validation documentation for reusable components DJF ‐
7.3.7 Verification and validation documentation for reusable components DJF ‐
6.2.4.4 Software configuration file DDF SCF All
6.2.4.5 Software configuration file DDF SCF All
6.2.4.8.b Software configuration file DDF SCF All
7.2.2.3.b Justification of design choices DDF SDD <4.5>
F.5 ECSS-Q-ST-80 Expected Output at QR
Clause Expected output Dest. File DRD Section
5.2.1.3 Software product assurance plan PAF SPAP All
5.2.2.3 Software product assurance milestone report PAF SPAMR All
6.2.3.3 Software product assurance milestone report PAF SPAMR <4>
6.2.6.12 Software product assurance milestone report PAF SPAMR <4>
5.5.5 Receiving inspection report PAF ‐
6.2.2.5 Software dependability and safety analysis report PAF ‐
6.2.2.6 Software dependability and safety analysis report PAF ‐
6.2.3.1.b Software dependability and safety analysis report PAF ‐
6.3.5.7 Statement of compliance with test plans and procedures PAF ‐
6.3.5.11 Statement of compliance with test plans and procedures PAF ‐
6.3.8.1 Maintenance plan MF ‐
6.3.8.2 Maintenance plan MF ‐
6.3.8.4 Maintenance plan MF ‐
6.3.8.5 Maintenance plan MF ‐
6.2.8.2 Validation and testing documentation DJF SVS <5.6>
6.2.8.7 Validation and testing documentation DJF SVS <5.6>
6.3.5.25 Test and validation documentation DJF SVS <7.2>, <8>
6.3.5.29 Test and validation documentation DJF SVS <6>
6.3.5.32 Software validation specification DJF SVS <5>
6.2.6.5 Software verification report DJF SVR <4.4>
6.2.6.6 Software verification report DJF SVR <4.4>
7.1.7 Numerical accuracy analysis DJF SVR <6>
7.2.3.6 Software verification report DJF SVR <4.5>
6.2.7.9 Software reuse file DJF SRF <8>
6.2.7.11 Software reuse file DJF SRF <9>
6.2.6.4 Software problem reports DJF ‐
6.2.6.13.a ISVV report DJF ‐
6.3.5.6 Nonconformance reports and software problem reports DJF ‐
6.3.5.8 Software problem reports DJF ‐
6.3.5.13 Testing and validation reports DJF ‐
6.3.5.16 Updated test documentation DJF ‐
6.3.5.17 Updated test documentation DJF ‐
6.3.5.18 Updated test documentation DJF ‐
6.3.5.28.b ISVV report DJF ‐
6.3.5.30 Testing and validation reports DJF ‐
6.3.5.31 Testing and validation reports DJF ‐
6.2.4.4 Software configuration file DDF SCF All
6.2.4.5 Software configuration file DDF SCF All
6.2.4.8.b Software configuration file DDF SCF All
F.6 ECSS-Q-ST-80 Expected Output at AR
Clause Expected output Dest. File DRD Section
5.2.1.3 Software product assurance plan PAF SPAP All
5.2.1.4 Software product assurance plan PAF SPAP <5.10>
5.2.2.3 Software product assurance milestone report PAF SPAMR All
6.2.3.3 Software product assurance milestone report PAF SPAMR <4>
6.2.6.12 Software product assurance milestone report PAF SPAMR <4>
6.2.2.5 Software dependability and safety analysis report PAF ‐
6.2.2.6 Software dependability and safety analysis report PAF ‐
6.2.3.1.b Software dependability and safety analysis report PAF ‐
6.3.5.7 Statement of compliance with test plans and procedures PAF ‐
6.3.5.11 Statement of compliance with test plans and procedures PAF ‐
6.3.8.1 Maintenance plan MF ‐
6.3.8.2 Maintenance plan MF ‐
6.3.8.4 Maintenance plan MF ‐
6.3.8.5 Maintenance plan MF ‐
6.2.8.2 Validation and testing documentation DJF SVS <5.6>
6.2.8.7 Validation and testing documentation DJF SVS <5.6>
6.3.5.25 Test and validation documentation DJF SVS <7.2>, <8>
6.3.5.29 Test and validation documentation DJF SVS <6>
6.3.5.32 Software validation specification DJF SVS <5>
6.2.6.5 Software verification report DJF SVR <4.4>
6.2.6.6 Software verification report DJF SVR <4.4>
7.2.3.6 Software verification report DJF SVR <4.5>
6.2.7.9 Software reuse file DJF SRF <8>
6.2.7.11 Software reuse file DJF SRF <9>
6.2.6.4 Software problem reports DJF ‐
6.2.6.13.a ISVV report DJF ‐
6.3.5.6 Nonconformance reports and software problem reports DJF ‐
6.3.5.8 Software problem reports DJF ‐
6.3.5.13 Testing and validation reports DJF ‐
6.3.5.16 Updated test documentation DJF ‐
6.3.5.17 Updated test documentation DJF ‐
6.3.5.18 Updated test documentation DJF ‐
6.3.5.27 Test and validation documentation DJF ‐
6.3.5.28.b ISVV report DJF ‐
6.3.5.30 Testing and validation reports DJF ‐
6.3.5.31 Testing and validation reports DJF ‐
6.3.6.3 Acceptance test plan DJF ‐
6.3.6.7 Nonconformance reports DJF ‐
6.3.6.8 Acceptance test report DJF ‐
6.3.6.9 Acceptance test report DJF ‐
6.2.4.4 Software configuration file DDF SCF All
6.2.4.5 Software configuration file DDF SCF All
6.2.4.8.b Software configuration file DDF SCF All
6.3.6.1 Installation procedure DDF SCF <4.2>
F.7 ECSS-Q-ST-80 Expected Output not associated with any specific milestone review
Clause Expected output Dest. File DRD Section
5.1.5.2 Records of training and experience PAF ‐
5.2.2.1 Software product assurance report PAF ‐
5.2.2.2 Software product assurance report PAF ‐
5.2.4.a Preliminary alert information PAF ‐
5.2.4.b Alert information PAF ‐
5.4.1.1.a Results of pre-award audits and assessments PAF ‐
5.4.1.1.b Records of procurement sources PAF ‐
5.6.1.3 Software product assurance reports PAF ‐
5.7.1 Software process assessment records: overall assessments and improvement programme plan PAF ‐
5.7.2.1.a Software process assessment record: assessment model PAF ‐
5.7.2.1.b Software process assessment record: assessment method PAF ‐
5.7.2.2.a Software process assessment record: evidence of conformance of the process assessment model PAF ‐
5.7.2.2.b Software process assessment record: assessment method PAF ‐
5.7.2.3 Software process assessment record: software process assessment recognition evidence PAF ‐
5.7.2.4 Software process assessment record: competent assessor justification PAF ‐
5.7.3.1 Software process assessment records: improvement plan PAF ‐
5.7.3.2 Software process assessment records: improvement process PAF ‐
5.7.3.3 Software process assessment records: evidence of improvements PAF ‐
6.2.5.4 Software product assurance reports PAF ‐
6.2.5.5 Software product assurance reports PAF ‐
6.2.6.2 Software product assurance reports PAF ‐
6.2.6.3 Software product assurance reports PAF ‐
6.2.6.7 Software product assurance reports PAF ‐
6.2.6.9 Review and inspection plans or procedures PAF ‐
6.2.6.10 Review and inspection plans or procedures PAF ‐
6.2.6.11 Review and inspection records PAF ‐
6.2.8.5 Software product assurance reports PAF ‐
6.3.3.4 Software product assurance reports PAF ‐
6.3.3.6 Software product assurance reports PAF ‐
6.3.3.7.b Software product assurance reports PAF ‐
6.3.4.7 Software product assurance reports PAF ‐
6.3.5.3 Software product assurance reports PAF ‐
6.3.5.5 Software product assurance reports PAF ‐
6.3.5.12 Software product assurance reports PAF ‐
7.1.6 Software product assurance reports PAF ‐
7.1.7 Software product assurance reports PAF ‐
6.3.8.6 Maintenance records MF ‐
6.3.8.7 Maintenance records MF ‐
5.2.6.1.a.b Nonconformance reports DJF ‐
5.4.1.2 Software reuse file DJF SRF All
6.2.4.3.a Software configuration file DDF SCF All
6.2.4.3.b Software release document DDF SRelD All
6.2.4.10 Software configuration file DDF SCF All
6.2.4.11.b Labels DDF ‐
Bibliography
ECSS-S-ST-00 ECSS system – Description, implementation and general requirements
ECSS-Q-HB-80-02 Space product assurance – Software process assessment and improvement
ECSS-Q-HB-80-03 Space product assurance – Software dependability and safety methods and techniques
ECSS-Q-HB-80-04 Space product assurance – Software metrication programme definition and implementation
ECSS-Q-ST-30-02 Space product assurance – Failure modes, effects (and criticality) analysis
IEEE 610.12:1990 IEEE Standard Glossary of software engineering terminology
IEEE 1028:1997 IEEE Standard for Software Reviews
ISO 9000:2000 Quality management systems – Fundamentals and vocabulary
ISO 9126-1:2001 Software engineering – Product quality – Part 1: Quality model
ISO/IEC 12207:1995 Information technology – Software life cycle processes
ISO/IEC 15504:1998 Information technology – Software process assessment
RTCA/DO-178B Software considerations in airborne systems and equipment certification
CMU/SEI-92-TR-022 Software Quality Measurement: A framework for counting problems and defects
CMU/SEI-2006-TR-008 CMMI for Development, Version 1.2